See <https://ci-beam.apache.org/job/beam_LoadTests_Java_GBK_Dataflow_V2_Streaming_Java11/88/display/redirect>

Changes:


------------------------------------------
[...truncated 36.07 KB...]
> Task :sdks:java:harness:jar
> Task :sdks:java:harness:shadowJar FROM-CACHE
> Task :sdks:java:container:buildLinuxAmd64
> Task :sdks:java:container:goBuild
> Task :sdks:java:container:java11:copySdkHarnessLauncher
> Task :runners:java-fn-execution:compileJava FROM-CACHE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar
> Task :sdks:java:expansion-service:compileJava FROM-CACHE
> Task :sdks:java:expansion-service:classes UP-TO-DATE
> Task :sdks:java:expansion-service:jar
> Task :sdks:java:io:kafka:compileJava FROM-CACHE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:io:kafka:jar
> Task :sdks:java:io:google-cloud-platform:compileJava FROM-CACHE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:jar
> Task :runners:google-cloud-dataflow-java:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-java:classes
> Task :sdks:java:testing:load-tests:compileJava FROM-CACHE
> Task :sdks:java:testing:load-tests:classes UP-TO-DATE
> Task :sdks:java:testing:load-tests:jar
> Task :runners:google-cloud-dataflow-java:jar
> Task :sdks:java:container:java11:copyDockerfileDependencies
> Task :sdks:java:container:generateLicenseReport

> Task :sdks:java:container:pullLicenses
Copying already-fetched licenses from <https://ci-beam.apache.org/job/beam_LoadTests_Java_GBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/reports/dependency-license> to <https://ci-beam.apache.org/job/beam_LoadTests_Java_GBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/target/java_third_party_licenses>
Already using interpreter /usr/bin/python3
Using base prefix '/usr'
New python executable in <https://ci-beam.apache.org/job/beam_LoadTests_Java_GBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/virtualenv/bin/python3>
Also creating executable in <https://ci-beam.apache.org/job/beam_LoadTests_Java_GBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/virtualenv/bin/python>
Installing setuptools, pip, wheel...
done.
DEPRECATION: Python 3.5 reached the end of its life on September 13th, 2020. Please upgrade your Python as Python 3.5 is no longer maintained. pip 21.0 will drop support for Python 3.5 in January 2021. pip 21.0 will remove support for this functionality.
Collecting beautifulsoup4<5.0,>=4.9.0
  Using cached beautifulsoup4-4.10.0-py3-none-any.whl (97 kB)
Collecting future<1.0.0,>=0.16.0
  Using cached future-0.18.2-py3-none-any.whl
Collecting pyyaml<6.0.0,>=3.12
  Using cached PyYAML-5.3.1-cp35-cp35m-linux_x86_64.whl
Collecting tenacity<6.0,>=5.0.2
  Using cached tenacity-5.1.5-py2.py3-none-any.whl (34 kB)
Collecting soupsieve>1.2
  Using cached soupsieve-2.1-py3-none-any.whl (32 kB)
Collecting six>=1.9.0
  Using cached six-1.16.0-py2.py3-none-any.whl (11 kB)
Installing collected packages: soupsieve, six, tenacity, pyyaml, future, beautifulsoup4
Successfully installed beautifulsoup4-4.10.0 future-0.18.2 pyyaml-5.3.1 six-1.16.0 soupsieve-2.1 tenacity-5.1.5
Executing <https://ci-beam.apache.org/job/beam_LoadTests_Java_GBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/virtualenv/bin/python> <https://ci-beam.apache.org/job/beam_LoadTests_Java_GBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/license_scripts/pull_licenses_java.py> --license_index=<https://ci-beam.apache.org/job/beam_LoadTests_Java_GBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/reports/dependency-license/index.json> --output_dir=<https://ci-beam.apache.org/job/beam_LoadTests_Java_GBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/target/java_third_party_licenses> --dep_url_yaml=<https://ci-beam.apache.org/job/beam_LoadTests_Java_GBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/license_scripts/dep_urls_java.yaml>
INFO:root:Pulling license for 207 dependencies using 16 threads.
INFO:root:pull_licenses_java.py succeed. It took 2.113717 seconds with 16 threads.
Copying licenses from <https://ci-beam.apache.org/job/beam_LoadTests_Java_GBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/target/java_third_party_licenses> to <https://ci-beam.apache.org/job/beam_LoadTests_Java_GBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/target/third_party_licenses.>
Finished license_scripts.sh

> Task :sdks:java:container:java11:copyJavaThirdPartyLicenses

> Task :release:go-licenses:java:dockerRun
+ go-licenses save github.com/apache/beam/sdks/java/container --save_path=/output/licenses
+ go-licenses csv github.com/apache/beam/sdks/java/container
+ tee /output/licenses/list.csv
google.golang.org/protobuf,https://go.googlesource.com/protobuf/+/refs/heads/master/LICENSE,BSD-3-Clause
google.golang.org/genproto/googleapis/rpc/status,https://github.com/googleapis/go-genproto/blob/master/LICENSE,Apache-2.0
golang.org/x/text,https://go.googlesource.com/text/+/refs/heads/master/LICENSE,BSD-3-Clause
github.com/apache/beam/sdks/java/container,https://github.com/apache/beam/blob/master/LICENSE,Apache-2.0
google.golang.org/grpc,https://github.com/grpc/grpc-go/blob/master/LICENSE,Apache-2.0
golang.org/x/net,https://go.googlesource.com/net/+/refs/heads/master/LICENSE,BSD-3-Clause
github.com/golang/protobuf,https://github.com/golang/protobuf/blob/master/LICENSE,BSD-3-Clause
golang.org/x/sys,https://go.googlesource.com/sys/+/refs/heads/master/LICENSE,BSD-3-Clause
github.com/apache/beam/sdks/go/pkg/beam,https://github.com/apache/beam/blob/master/sdks/go/README.md,Apache-2.0
+ chmod -R a+w /output/licenses

> Task :release:go-licenses:java:createLicenses
> Task :sdks:java:container:java11:copyGolangLicenses
> Task :sdks:java:container:java11:dockerPrepare
> Task :sdks:java:container:java11:docker

> Task :runners:google-cloud-dataflow-java:buildAndPushDockerContainer
WARNING: `gcloud docker` will not be supported for Docker client versions above 18.03.

As an alternative, use `gcloud auth configure-docker` to configure `docker` to use `gcloud` as a credential helper, then use `docker` as you would for non-GCR registries, e.g. `docker pull gcr.io/project-id/my-image`. Add `--verbosity=error` to silence this warning: `gcloud docker --verbosity=error -- pull gcr.io/project-id/my-image`.

See: https://cloud.google.com/container-registry/docs/support/deprecation-notices#gcloud-docker

The push refers to repository [us.gcr.io/apache-beam-testing/java-postcommit-it/java]
1e1b4b515def: Preparing
f5dae4778b39: Preparing
ef403431f92d: Preparing
5d1b64738043: Preparing
f308cdb4647c: Preparing
37214ef2f760: Preparing
427ccfdee7ec: Preparing
1e6dc750b37d: Preparing
2a3d4d007c9d: Preparing
7da565d631cf: Preparing
dec0c23cfbc3: Preparing
5d5508584f9d: Preparing
3891808a925b: Preparing
d402f4f1b906: Preparing
00ef5416d927: Preparing
8555e663f65b: Preparing
d00da3cd7763: Preparing
4e61e63529c2: Preparing
799760671c38: Preparing
427ccfdee7ec: Waiting
1e6dc750b37d: Waiting
2a3d4d007c9d: Waiting
7da565d631cf: Waiting
37214ef2f760: Waiting
4e61e63529c2: Waiting
dec0c23cfbc3: Waiting
d00da3cd7763: Waiting
8555e663f65b: Waiting
799760671c38: Waiting
00ef5416d927: Waiting
5d5508584f9d: Waiting
3891808a925b: Waiting
f308cdb4647c: Pushed
f5dae4778b39: Pushed
ef403431f92d: Pushed
37214ef2f760: Pushed
1e1b4b515def: Pushed
5d1b64738043: Pushed
1e6dc750b37d: Pushed
2a3d4d007c9d: Pushed
3891808a925b: Layer already exists
dec0c23cfbc3: Pushed
00ef5416d927: Layer already exists
d402f4f1b906: Layer already exists
8555e663f65b: Layer already exists
d00da3cd7763: Layer already exists
5d5508584f9d: Pushed
427ccfdee7ec: Pushed
799760671c38: Layer already exists
4e61e63529c2: Layer already exists
7da565d631cf: Pushed
20210913120954: digest: sha256:03d6b8c61d86ae64b5ec4cdb1736ff20bbca6d246eec2cc81860e8ba70fedb17 size: 4311

> Task :sdks:java:testing:load-tests:run
Sep 13, 2021 12:11:59 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Sep 13, 2021 12:11:59 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 195 files. Enable logging at DEBUG level to see which files will be staged.
Sep 13, 2021 12:12:00 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Sep 13, 2021 12:12:02 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 195 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Sep 13, 2021 12:12:04 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 195 files cached, 0 files newly uploaded in 1 seconds
Sep 13, 2021 12:12:04 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Sep 13, 2021 12:12:04 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <86835 bytes, hash fd98c1187983f97848b85439f8480b75bdbd89f0ba9651ac86c4c4ae61f0e446> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-_ZjBGHmD-XhIuFQ5-EgLdb29ifC6llGshsTErmHw5EY.pb
Sep 13, 2021 12:12:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Sep 13, 2021 12:12:06 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: 
[org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@68ab0936, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3cd9aa64, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@42b84286, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@443effcb, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@74ecacc3, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@517a2b0, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@53b7ce6, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@36480b2d, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27d33393, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1f6917fb, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@41eb94bc, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@378cfecf, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@97d0c06, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5e7c141d, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43af351a, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1305c126, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@72f9f27c, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4c1bdcc2, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@762637be, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4b97c4ad]
Sep 13, 2021 12:12:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Sep 13, 2021 12:12:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics as step s3
Sep 13, 2021 12:12:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Total bytes monitor as step s4
Sep 13, 2021 12:12:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s5
Sep 13, 2021 12:12:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Group by key (0) as step s6
Sep 13, 2021 12:12:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate (0) as step s7
Sep 13, 2021 12:12:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics (0) as step s8
Sep 13, 2021 12:12:06 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
Sep 13, 2021 12:12:07 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-13_05_12_06-13723217321559341727?project=apache-beam-testing
Sep 13, 2021 12:12:07 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-09-13_05_12_06-13723217321559341727
Sep 13, 2021 12:12:07 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-13_05_12_06-13723217321559341727
Sep 13, 2021 12:12:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-13T12:12:14.551Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0gbk01-jenkins-0913-8ibf. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
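The warning is cosmetic for this run, but the restriction is easy to satisfy up front. The helper below is a hypothetical sketch, not Beam's or Dataflow's actual sanitizer, assuming the documented label rules (lowercase letters, digits, underscores and hyphens, at most 63 characters, and conservatively starting with a lowercase letter):

import java.util.Locale;

// Hypothetical helper (not part of Beam or Dataflow): derive a label-safe job name
// per https://cloud.google.com/compute/docs/labeling-resources#restrictions.
static String toLabelSafeJobName(String jobName) {
  // Lowercase, then replace anything outside [a-z0-9_-] with a hyphen.
  String label = jobName.toLowerCase(Locale.ROOT).replaceAll("[^a-z0-9_-]", "-");
  // Conservative choice: start with a lowercase letter.
  if (label.isEmpty() || !Character.isLowerCase(label.charAt(0))) {
    label = "j" + label;
  }
  // Label values are limited to 63 characters.
  return label.length() > 63 ? label.substring(0, 63) : label;
}

Naming the job with a value like this avoids the modified-name substitution Dataflow applies above.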
Sep 13, 2021 12:12:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-13T12:12:19.349Z: Worker configuration: e2-standard-2 in us-central1-a.
Sep 13, 2021 12:12:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-13T12:12:20.025Z: Workflow failed. Causes: Project apache-beam-testing has insufficient quota(s) to execute this workflow with 1 instances in region us-central1. Quota summary (required/available): 1/24386 instances, 2/0 CPUs, 30/183716 disk GB, 0/2397 SSD disk GB, 1/288 instance groups, 1/291 managed instance groups, 1/517 instance templates, 1/615 in-use IP addresses.

Please see https://cloud.google.com/compute/docs/resource-quotas about requesting more quota.
Sep 13, 2021 12:12:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-13T12:12:20.056Z: Cleaning up.
Sep 13, 2021 12:12:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-13T12:12:20.129Z: Worker pool stopped.
Sep 13, 2021 12:12:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-13T12:12:21.332Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Sep 13, 2021 12:12:24 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-09-13_05_12_06-13723217321559341727 failed with status FAILED.
Sep 13, 2021 12:12:24 PM org.apache.beam.sdk.testutils.metrics.MetricsReader getCounterMetric
SEVERE: Failed to get metric totalBytes.count, from namespace gbk
Load test results for test (ID): 5049c894-2211-4bc5-bdf0-1b15eaeba5e6 and timestamp: 2021-09-13T12:12:00.109000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                       0.0
dataflow_v2_java11_total_bytes_count                      -1.0
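The -1.0 above is the sentinel the test harness prints when a metric cannot be found, which follows from the job failing before any work ran. For reference, a counter like the one named in the log can be read back from a finished PipelineResult with Beam's metrics API. This is a minimal sketch; the namespace and name strings ("gbk", "totalBytes.count") mirror the log line above, but how the load test actually registers the counter is an assumption:

import org.apache.beam.sdk.PipelineResult;
import org.apache.beam.sdk.metrics.MetricNameFilter;
import org.apache.beam.sdk.metrics.MetricQueryResults;
import org.apache.beam.sdk.metrics.MetricResult;
import org.apache.beam.sdk.metrics.MetricsFilter;

// Read a counter from a finished pipeline; returns -1 when the metric is absent,
// mirroring the sentinel printed in the results table above.
static long readCounter(PipelineResult result, String namespace, String name) {
  MetricQueryResults metrics =
      result.metrics().queryMetrics(
          MetricsFilter.builder()
              .addNameFilter(MetricNameFilter.named(namespace, name))
              .build());
  for (MetricResult<Long> counter : metrics.getCounters()) {
    return counter.getAttempted(); // committed values are not supported on all runners
  }
  return -1L;
}

For this run, readCounter(result, "gbk", "totalBytes.count") would return -1, since no worker ever started.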
Exception in thread "main" java.lang.RuntimeException: Invalid job state: FAILED.
        at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
        at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
        at org.apache.beam.sdk.loadtests.GroupByKeyLoadTest.run(GroupByKeyLoadTest.java:57)
        at org.apache.beam.sdk.loadtests.GroupByKeyLoadTest.main(GroupByKeyLoadTest.java:131)
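The exception is the load test surfacing the job's terminal state rather than a bug of its own. The actual check lives in JobFailure.handleFailure, which is not reproduced here; as a rough sketch of the pattern, under the assumption that the test simply rejects any non-successful terminal state:

import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.PipelineResult;

// Sketch only, not the real JobFailure implementation: run the pipeline,
// wait for a terminal state, and fail the test run if it is not DONE.
static void runAndCheck(Pipeline pipeline) {
  PipelineResult result = pipeline.run();
  PipelineResult.State state = result.waitUntilFinish();
  if (state != PipelineResult.State.DONE) {
    throw new RuntimeException("Invalid job state: " + state + ".");
  }
}

So the root cause here is the quota failure reported earlier, not the load-test code path that raised the exception.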

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210913120954
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:03d6b8c61d86ae64b5ec4cdb1736ff20bbca6d246eec2cc81860e8ba70fedb17
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210913120954]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:03d6b8c61d86ae64b5ec4cdb1736ff20bbca6d246eec2cc81860e8ba70fedb17]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210913120954] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:03d6b8c61d86ae64b5ec4cdb1736ff20bbca6d246eec2cc81860e8ba70fedb17])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:03d6b8c61d86ae64b5ec4cdb1736ff20bbca6d246eec2cc81860e8ba70fedb17
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:03d6b8c61d86ae64b5ec4cdb1736ff20bbca6d246eec2cc81860e8ba70fedb17
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:03d6b8c61d86ae64b5ec4cdb1736ff20bbca6d246eec2cc81860e8ba70fedb17].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 8s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/b4zctwesp7hgs

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
