See 
<https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Streaming_Java11/87/display/redirect>

Changes:


------------------------------------------
[...truncated 39.92 KB...]
> Task :sdks:java:io:google-cloud-platform:jar
> Task :sdks:java:testing:load-tests:compileJava FROM-CACHE
> Task :sdks:java:testing:load-tests:classes UP-TO-DATE
> Task :sdks:java:testing:load-tests:jar
> Task :runners:google-cloud-dataflow-java:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-java:classes
> Task :runners:google-cloud-dataflow-java:jar
> Task :sdks:java:container:java11:copyDockerfileDependencies
> Task :sdks:java:container:generateLicenseReport

> Task :sdks:java:container:pullLicenses
Copying already-fetched licenses from <https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/reports/dependency-license> to <https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/target/java_third_party_licenses>
Already using interpreter /usr/bin/python3
Using base prefix '/usr'
New python executable in <https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/virtualenv/bin/python3>
Also creating executable in <https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/virtualenv/bin/python>
Installing setuptools, pip, wheel...
done.
DEPRECATION: Python 3.5 reached the end of its life on September 13th, 2020. 
Please upgrade your Python as Python 3.5 is no longer maintained. pip 21.0 will 
drop support for Python 3.5 in January 2021. pip 21.0 will remove support for 
this functionality.
Collecting beautifulsoup4<5.0,>=4.9.0
  Using cached beautifulsoup4-4.10.0-py3-none-any.whl (97 kB)
Collecting future<1.0.0,>=0.16.0
  Using cached future-0.18.2-py3-none-any.whl
Collecting pyyaml<6.0.0,>=3.12
  Using cached PyYAML-5.3.1-cp35-cp35m-linux_x86_64.whl
Collecting tenacity<6.0,>=5.0.2
  Using cached tenacity-5.1.5-py2.py3-none-any.whl (34 kB)
Collecting soupsieve>1.2
  Using cached soupsieve-2.1-py3-none-any.whl (32 kB)
Collecting six>=1.9.0
  Using cached six-1.16.0-py2.py3-none-any.whl (11 kB)
Installing collected packages: soupsieve, six, tenacity, pyyaml, future, 
beautifulsoup4
Successfully installed beautifulsoup4-4.10.0 future-0.18.2 pyyaml-5.3.1 
six-1.16.0 soupsieve-2.1 tenacity-5.1.5
Executing <https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/virtualenv/bin/python> <https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/license_scripts/pull_licenses_java.py> --license_index=<https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/reports/dependency-license/index.json> --output_dir=<https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/target/java_third_party_licenses> --dep_url_yaml=<https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/license_scripts/dep_urls_java.yaml>
INFO:root:Pulling license for 207 dependencies using 16 threads.
INFO:root:pull_licenses_java.py succeed. It took 2.213777 seconds with 16 
threads.
Copying licenses from <https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/target/java_third_party_licenses> to <https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/target/third_party_licenses.>
Finished license_scripts.sh

> Task :sdks:java:container:java11:copyJavaThirdPartyLicenses

> Task :release:go-licenses:java:dockerRun
+ go-licenses save github.com/apache/beam/sdks/java/container --save_path=/output/licenses
+ go-licenses csv github.com/apache/beam/sdks/java/container
+ tee /output/licenses/list.csv
golang.org/x/net,https://go.googlesource.com/net/+/refs/heads/master/LICENSE,BSD-3-Clause
github.com/golang/protobuf,https://github.com/golang/protobuf/blob/master/LICENSE,BSD-3-Clause
google.golang.org/genproto/googleapis/rpc/status,https://github.com/googleapis/go-genproto/blob/master/LICENSE,Apache-2.0
golang.org/x/sys,https://go.googlesource.com/sys/+/refs/heads/master/LICENSE,BSD-3-Clause
github.com/apache/beam/sdks/java/container,https://github.com/apache/beam/blob/master/LICENSE,Apache-2.0
github.com/apache/beam/sdks/go/pkg/beam,https://github.com/apache/beam/blob/master/sdks/go/README.md,Apache-2.0
google.golang.org/grpc,https://github.com/grpc/grpc-go/blob/master/LICENSE,Apache-2.0
google.golang.org/protobuf,https://go.googlesource.com/protobuf/+/refs/heads/master/LICENSE,BSD-3-Clause
golang.org/x/text,https://go.googlesource.com/text/+/refs/heads/master/LICENSE,BSD-3-Clause
+ chmod -R a+w /output/licenses

> Task :release:go-licenses:java:createLicenses
> Task :sdks:java:container:java11:copyGolangLicenses
> Task :sdks:java:container:java11:dockerPrepare
> Task :sdks:java:container:java11:docker

> Task :runners:google-cloud-dataflow-java:buildAndPushDockerContainer
WARNING: `gcloud docker` will not be supported for Docker client versions above 
18.03.

As an alternative, use `gcloud auth configure-docker` to configure `docker` to
use `gcloud` as a credential helper, then use `docker` as you would for non-GCR
registries, e.g. `docker pull gcr.io/project-id/my-image`. Add
`--verbosity=error` to silence this warning: `gcloud docker
--verbosity=error -- pull gcr.io/project-id/my-image`.

See: 
https://cloud.google.com/container-registry/docs/support/deprecation-notices#gcloud-docker

The push refers to repository 
[us.gcr.io/apache-beam-testing/java-postcommit-it/java]
88b2a674b1cb: Preparing
94616a3ac86c: Preparing
cd9ae89b52ce: Preparing
5be348f6ede0: Preparing
15a678a51300: Preparing
8588921e24f8: Preparing
892bb470a9d8: Preparing
e024f3d10ea8: Preparing
e847979838e9: Preparing
4efc8d54cbfe: Preparing
dba809b136c6: Preparing
41854e89e8a3: Preparing
3891808a925b: Preparing
d402f4f1b906: Preparing
00ef5416d927: Preparing
8555e663f65b: Preparing
d00da3cd7763: Preparing
4e61e63529c2: Preparing
799760671c38: Preparing
4efc8d54cbfe: Waiting
3891808a925b: Waiting
d402f4f1b906: Waiting
dba809b136c6: Waiting
41854e89e8a3: Waiting
8588921e24f8: Waiting
00ef5416d927: Waiting
8555e663f65b: Waiting
892bb470a9d8: Waiting
d00da3cd7763: Waiting
e024f3d10ea8: Waiting
799760671c38: Waiting
e847979838e9: Waiting
cd9ae89b52ce: Pushed
94616a3ac86c: Pushed
15a678a51300: Pushed
8588921e24f8: Pushed
88b2a674b1cb: Pushed
5be348f6ede0: Pushed
e024f3d10ea8: Pushed
e847979838e9: Pushed
3891808a925b: Layer already exists
d402f4f1b906: Layer already exists
dba809b136c6: Pushed
00ef5416d927: Layer already exists
41854e89e8a3: Pushed
892bb470a9d8: Pushed
8555e663f65b: Layer already exists
d00da3cd7763: Layer already exists
4e61e63529c2: Layer already exists
799760671c38: Layer already exists
4efc8d54cbfe: Pushed
20210912123752: digest: sha256:2be8dd4d067015352e3d07e79fe1a9dca506491bd7e3291240a36d73403794d5 size: 4311

> Task :sdks:java:testing:load-tests:run
Sep 12, 2021 12:40:15 PM 
org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory
 create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Sep 12, 2021 12:40:15 PM org.apache.beam.runners.dataflow.DataflowRunner 
fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from 
the classpath: will stage 195 files. Enable logging at DEBUG level to see which 
files will be staged.
Sep 12, 2021 12:40:16 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: 
ParDo(TimeMonitor)
Sep 12, 2021 12:40:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing 
implications related to Google Compute Engine usage and other Google Cloud 
Services.
Sep 12, 2021 12:40:22 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
INFO: Uploading 195 files from PipelineOptions.filesToStage to staging location 
to prepare for execution.
Sep 12, 2021 12:40:22 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
INFO: Staging files complete: 195 files cached, 0 files newly uploaded in 0 
seconds
Sep 12, 2021 12:40:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to 
gs://temp-storage-for-perf-tests/loadtests/staging/
Sep 12, 2021 12:40:23 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading <101788 bytes, hash d34f0453dcbb233211249c9b54eb059e289ee181c06ad3e1601ccfe3d3ad9263> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-008EU9y7IzIRJJybVOsFniie4YHAatPhYBzP49OtkmM.pb
Sep 12, 2021 12:40:24 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as 
step s1
Sep 12, 2021 12:40:24 PM 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: 
[org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@291373d3, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@372ca2d6, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3204e238, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@38ed139b, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@a5272be, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@58ba5b30, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4dba773d, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1d9bd4da, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4c58255, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@eac3a26, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@10b1a751, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@53cf9c99, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b306b9f, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@142213d5, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@934b52f, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2630dbc4, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5ea4300e, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5a1c3cb4, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@76ad6715, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@56637cff]
Sep 12, 2021 12:40:24 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Sep 12, 2021 12:40:24 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(TimeMonitor) as step s3
Sep 12, 2021 12:40:24 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(ByteMonitor) as step s4
Sep 12, 2021 12:40:24 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 0 as step s5
Sep 12, 2021 12:40:24 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 1 as step s6
Sep 12, 2021 12:40:24 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 2 as step s7
Sep 12, 2021 12:40:24 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 3 as step s8
Sep 12, 2021 12:40:24 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 4 as step s9
Sep 12, 2021 12:40:24 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 5 as step s10
Sep 12, 2021 12:40:24 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 6 as step s11
Sep 12, 2021 12:40:24 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 7 as step s12
Sep 12, 2021 12:40:24 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 8 as step s13
Sep 12, 2021 12:40:24 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 9 as step s14
Sep 12, 2021 12:40:24 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(TimeMonitor)2 as step s15
Sep 12, 2021 12:40:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
Sep 12, 2021 12:40:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-12_05_40_24-10635049069975718521?project=apache-beam-testing
Sep 12, 2021 12:40:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-09-12_05_40_24-10635049069975718521
Sep 12, 2021 12:40:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-12_05_40_24-10635049069975718521
Sep 12, 2021 12:40:32 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-12T12:40:31.305Z: The workflow name is not a valid Cloud 
Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring 
will be labeled with this modified job name: 
load0tests0java110dataflow0v20streaming0pardo01-jenkins-09-1wcm. For the best 
monitoring experience, please name your job with a valid Cloud Label. For 
details, see: 
https://cloud.google.com/compute/docs/labeling-resources#restrictions
Sep 12, 2021 12:40:37 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-12T12:40:35.921Z: Worker configuration: e2-standard-2 in 
us-central1-a.
Sep 12, 2021 12:40:37 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-12T12:40:36.780Z: Workflow failed. Causes: Project 
apache-beam-testing has insufficient quota(s) to execute this workflow with 1 
instances in region us-central1. Quota summary (required/available): 1/24379 
instances, 2/0 CPUs, 30/186406 disk GB, 0/2397 SSD disk GB, 1/280 instance 
groups, 1/283 managed instance groups, 1/509 instance templates, 1/608 in-use 
IP addresses.

Please see https://cloud.google.com/compute/docs/resource-quotas about 
requesting more quota.
Sep 12, 2021 12:40:37 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-12T12:40:36.841Z: Cleaning up.
Sep 12, 2021 12:40:37 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-12T12:40:36.915Z: Worker pool stopped.
Sep 12, 2021 12:40:39 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-12T12:40:38.093Z: Your project already contains 100 
Dataflow-created metric descriptors, so new user metrics of the form 
custom.googleapis.com/* will not be created. However, all user metrics are also 
available in the metric dataflow.googleapis.com/job/user_counter. If you rely 
on the custom metrics, you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Sep 12, 2021 12:40:42 PM org.apache.beam.runners.dataflow.DataflowPipelineJob 
logTerminalState
INFO: Job 2021-09-12_05_40_24-10635049069975718521 failed with status FAILED.
Sep 12, 2021 12:40:42 PM org.apache.beam.sdk.testutils.metrics.MetricsReader 
getCounterMetric
SEVERE: Failed to get metric totalBytes.count, from namespace pardo
Load test results for test (ID): 71b7da4c-3196-477b-acac-c336aa4a91e3 and 
timestamp: 2021-09-12T12:40:16.422000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                       0.0
dataflow_v2_java11_total_bytes_count                      -1.0
Exception in thread "main" java.lang.RuntimeException: Invalid job state: FAILED.
        at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
        at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
        at org.apache.beam.sdk.loadtests.ParDoLoadTest.run(ParDoLoadTest.java:53)
        at org.apache.beam.sdk.loadtests.ParDoLoadTest.main(ParDoLoadTest.java:103)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210912123752
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:2be8dd4d067015352e3d07e79fe1a9dca506491bd7e3291240a36d73403794d5
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210912123752]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:2be8dd4d067015352e3d07e79fe1a9dca506491bd7e3291240a36d73403794d5]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210912123752] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:2be8dd4d067015352e3d07e79fe1a9dca506491bd7e3291240a36d73403794d5])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:2be8dd4d067015352e3d07e79fe1a9dca506491bd7e3291240a36d73403794d5
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:2be8dd4d067015352e3d07e79fe1a9dca506491bd7e3291240a36d73403794d5
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:2be8dd4d067015352e3d07e79fe1a9dca506491bd7e3291240a36d73403794d5].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 17s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/olwlsut5ue6xw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
