See 
<https://ci-beam.apache.org/job/beam_LoadTests_Java_GBK_Dataflow_Streaming/636/display/redirect?page=changes>

Changes:

[ankurgoenka] Fix example syntax in SQL walkthough

[Udi Meiri] [BEAM-11688] Support partial proto encoding

[zyichi] [BEAM-11733] Skip flaky healthcare IO integration tests.

[Robert Bradshaw] Document recommended pipeline run pattern for Python.

[zyichi] [BEAM-11581] Minor fix to skip ExecutionStateSampler.reset() in

[noreply] Merge pull request #13723 from [BEAM-11736] Propagate pipeline options

[noreply] [BEAM-11715] Add ValidatesRunner test for translations.pack_combiners

[kileysok] Add tag to docker push

[noreply] [BEAM-11720] Don't assume a particular pip location, use default pip

[noreply] [BEAM-11589] Migrated settings.gradle file to Kotlin script. (#13837)

[noreply] Merge pull request #13824 from [BEAM-11700] Fix PortableRunner skipping


------------------------------------------
[...truncated 64.83 KB...]
> Task :model:fn-execution:extractProto UP-TO-DATE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :sdks:java:expansion-service:processResources NO-SOURCE
> Task :sdks:java:extensions:protobuf:createCheckerFrameworkManifest UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:createCheckerFrameworkManifest UP-TO-DATE
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :runners:google-cloud-dataflow-java:processResources UP-TO-DATE
> Task :model:fn-execution:processResources UP-TO-DATE
> Task :sdks:java:extensions:protobuf:extractProto UP-TO-DATE
> Task :model:job-management:processResources UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:windmill:createCheckerFrameworkManifest UP-TO-DATE
> Task :sdks:java:io:kafka:createCheckerFrameworkManifest UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:createCheckerFrameworkManifest UP-TO-DATE
> Task :sdks:java:io:kinesis:createCheckerFrameworkManifest UP-TO-DATE
> Task :sdks:java:io:synthetic:createCheckerFrameworkManifest UP-TO-DATE
> Task :sdks:java:testing:load-tests:createCheckerFrameworkManifest UP-TO-DATE
> Task :sdks:java:extensions:protobuf:processResources NO-SOURCE
> Task :sdks:java:io:kinesis:processResources NO-SOURCE
> Task :sdks:java:io:google-cloud-platform:processResources NO-SOURCE
> Task :sdks:java:testing:test-utils:createCheckerFrameworkManifest UP-TO-DATE
> Task :sdks:java:testing:load-tests:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :sdks:java:io:synthetic:processResources NO-SOURCE
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:processResources NO-SOURCE
> Task :sdks:java:io:kafka:processResources NO-SOURCE
> Task :sdks:java:testing:test-utils:processResources NO-SOURCE
> Task :sdks:java:core:generateGrammarSource UP-TO-DATE
> Task :sdks:java:core:processResources UP-TO-DATE
> Task :model:pipeline:extractIncludeProto UP-TO-DATE
> Task :model:pipeline:extractProto UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:windmill:extractIncludeProto UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:windmill:extractProto UP-TO-DATE
> Task :model:pipeline:generateProto UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:windmill:generateProto UP-TO-DATE
> Task :model:pipeline:compileJava UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:windmill:compileJava UP-TO-DATE
> Task :model:pipeline:processResources UP-TO-DATE
> Task :model:pipeline:classes UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:windmill:processResources UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:windmill:classes UP-TO-DATE
> Task :model:pipeline:jar UP-TO-DATE
> Task :model:pipeline:shadowJar UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:windmill:shadowJar UP-TO-DATE
> Task :model:job-management:extractIncludeProto UP-TO-DATE
> Task :model:fn-execution:extractIncludeProto UP-TO-DATE
> Task :model:fn-execution:generateProto UP-TO-DATE
> Task :model:job-management:generateProto UP-TO-DATE
> Task :model:fn-execution:compileJava UP-TO-DATE
> Task :model:job-management:compileJava UP-TO-DATE
> Task :model:fn-execution:classes UP-TO-DATE
> Task :model:job-management:classes UP-TO-DATE
> Task :model:job-management:shadowJar UP-TO-DATE
> Task :model:fn-execution:shadowJar UP-TO-DATE
> Task :sdks:java:core:compileJava UP-TO-DATE
> Task :sdks:java:core:classes UP-TO-DATE
> Task :sdks:java:core:shadowJar UP-TO-DATE
> Task :sdks:java:extensions:protobuf:extractIncludeProto UP-TO-DATE
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :sdks:java:fn-execution:compileJava UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:compileJava UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :sdks:java:fn-execution:jar UP-TO-DATE
> Task :sdks:java:io:synthetic:compileJava UP-TO-DATE
> Task :sdks:java:io:synthetic:classes UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:shadowJar UP-TO-DATE
> Task :sdks:java:io:synthetic:jar UP-TO-DATE
> Task :sdks:java:extensions:protobuf:compileJava UP-TO-DATE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:jar UP-TO-DATE
> Task :sdks:java:io:kinesis:compileJava UP-TO-DATE
> Task :sdks:java:io:kinesis:classes UP-TO-DATE
> Task :sdks:java:io:kinesis:jar UP-TO-DATE
> Task :sdks:java:testing:test-utils:compileJava UP-TO-DATE
> Task :runners:core-construction-java:compileJava UP-TO-DATE
> Task :sdks:java:testing:test-utils:classes UP-TO-DATE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :sdks:java:testing:test-utils:jar UP-TO-DATE
> Task :runners:core-construction-java:jar UP-TO-DATE
> Task :runners:core-java:compileJava UP-TO-DATE
> Task :runners:core-java:classes UP-TO-DATE
> Task :runners:core-java:jar UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar UP-TO-DATE
> Task :sdks:java:harness:compileJava UP-TO-DATE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar UP-TO-DATE
> Task :sdks:java:harness:shadowJar UP-TO-DATE
> Task :runners:java-fn-execution:compileJava UP-TO-DATE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar UP-TO-DATE
> Task :sdks:java:expansion-service:compileJava UP-TO-DATE
> Task :sdks:java:expansion-service:classes UP-TO-DATE
> Task :sdks:java:expansion-service:jar UP-TO-DATE
> Task :sdks:java:io:kafka:compileJava UP-TO-DATE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:io:kafka:jar UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:compileJava UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:jar UP-TO-DATE
> Task :sdks:java:testing:load-tests:compileJava UP-TO-DATE
> Task :sdks:java:testing:load-tests:classes UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:compileJava UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:classes UP-TO-DATE
> Task :sdks:java:testing:load-tests:jar UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:jar UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:classes UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar UP-TO-DATE

> Task :sdks:java:testing:load-tests:run
Feb 03, 2021 12:46:56 PM 
org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory
 create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Feb 03, 2021 12:46:56 PM org.apache.beam.runners.dataflow.DataflowRunner 
fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from 
the classpath: will stage 187 files. Enable logging at DEBUG level to see which 
files will be staged.
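
The two INFO messages above mean the job relied on defaults: stagingLocation fell back to gcpTempLocation, and filesToStage was populated from the classpath (187 jars). A minimal sketch of setting these options explicitly when launching a similar pipeline is shown below; the project, region, bucket paths and jar path are placeholders rather than the values used by this Jenkins job, and it assumes the filesToStage setter is reachable through DataflowPipelineOptions, as the runner's log message implies.

// Minimal sketch with placeholder values; not the load test's actual launcher.
import java.util.Arrays;
import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class ExplicitStagingOptions {
  public static void main(String[] args) {
    DataflowPipelineOptions options =
        PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineOptions.class);
    options.setProject("my-gcp-project");                  // placeholder project
    options.setRegion("us-central1");
    options.setGcpTempLocation("gs://my-bucket/temp");     // the fallback used above
    options.setStagingLocation("gs://my-bucket/staging");  // avoids the stagingLocation fallback
    // Listing jars explicitly avoids staging the whole classpath (187 files above):
    options.setFilesToStage(Arrays.asList("/path/to/bundled-pipeline.jar")); // placeholder path
  }
}
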
Feb 03, 2021 12:46:58 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing 
implications related to Google Compute Engine usage and other Google Cloud 
Services.
Feb 03, 2021 12:46:59 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
INFO: Uploading 188 files from PipelineOptions.filesToStage to staging location 
to prepare for execution.
Feb 03, 2021 12:46:59 PM 
org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes 
forFileToStage
INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.29.0-SNAPSHOT-hcrHl3nIcMxtK6uIC61YfEhyUatJnS8yaBhg51-cb3k.jar
Feb 03, 2021 12:47:00 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
INFO: Staging files complete: 188 files cached, 0 files newly uploaded in 0 
seconds
Feb 03, 2021 12:47:00 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as 
step s1
Feb 03, 2021 12:47:00 PM 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: 
[org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@54cf7c6a, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@78010562, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@50756c76, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@38aafb53, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1729ec00, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@67f3d192, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1c9e07c6, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@73010765, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2b10ace9, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@52169758, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3eda0aeb, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@459b187a, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6b4283c4, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@d0865a3, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@636bbbbb, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7eae3764, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@10dc7d6, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4f668f29, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@716e431d, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e744f43]
Feb 03, 2021 12:47:00 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Feb 03, 2021 12:47:00 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics as step s3
Feb 03, 2021 12:47:00 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Total bytes monitor as step s4
Feb 03, 2021 12:47:00 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s5
Feb 03, 2021 12:47:00 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Group by key (0) as step s6
Feb 03, 2021 12:47:00 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate (0) as step s7
Feb 03, 2021 12:47:00 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics (0) as step s8
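
Steps s1-s8 above trace the shape of the GroupByKey load test pipeline: a synthetic unbounded read, metric-collecting ParDos, a windowing step, the GroupByKey itself, and an ungroup/metrics tail. A rough sketch of the windowing and grouping portion (s5-s6) follows; it only approximates GroupByKeyLoadTest, and the window size is a placeholder.

// Rough sketch of steps s5-s6 only; not the actual GroupByKeyLoadTest code.
import org.apache.beam.sdk.transforms.GroupByKey;
import org.apache.beam.sdk.transforms.windowing.FixedWindows;
import org.apache.beam.sdk.transforms.windowing.Window;
import org.apache.beam.sdk.values.KV;
import org.apache.beam.sdk.values.PCollection;
import org.joda.time.Duration;

class GbkLoadTestShape {
  static PCollection<KV<byte[], Iterable<byte[]>>> windowAndGroup(
      PCollection<KV<byte[], byte[]>> input) {
    return input
        // s5: Window.Into()/Window.Assign (window size is a placeholder)
        .apply(Window.<KV<byte[], byte[]>>into(FixedWindows.of(Duration.standardSeconds(10))))
        // s6: Group by key (0); on streaming Dataflow this expands into the
        // WriteStream/ReadStream/MergeBuckets steps fused later in this log
        .apply("Group by key (0)", GroupByKey.<byte[], byte[]>create());
  }
}
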
Feb 03, 2021 12:47:00 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging pipeline description to 
gs://temp-storage-for-perf-tests/loadtests/staging/
Feb 03, 2021 12:47:00 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.29.0-SNAPSHOT
Feb 03, 2021 12:47:02 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-03_04_47_00-9334879096971007819?project=apache-beam-testing
Feb 03, 2021 12:47:02 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-02-03_04_47_00-9334879096971007819
Feb 03, 2021 12:47:02 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-02-03_04_47_00-9334879096971007819
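
Besides the gcloud command the runner prints, a job launched from Java can also be cancelled through the PipelineResult handle returned by Pipeline.run(). A minimal sketch (not something this load test does at this point in the log) follows.

// Minimal sketch: cancelling a submitted job from the launching program
// instead of the printed gcloud command. Not part of this load test's flow.
import java.io.IOException;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.PipelineResult;

class CancelFromLauncher {
  static void runThenCancel(Pipeline pipeline) throws IOException {
    PipelineResult result = pipeline.run();  // on Dataflow this is a DataflowPipelineJob
    // ... wait until the measurement window is over, then request cancellation ...
    result.cancel();                         // equivalent to the gcloud cancel above
    result.waitUntilFinish();                // terminal state will be CANCELLED
  }
}
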
Feb 03, 2021 12:47:08 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-02-03T12:47:06.742Z: The workflow name is not a valid Cloud 
Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring 
will be labeled with this modified job name: 
load0tests0java0dataflow0streaming0gbk02-jenkins-020312465-2no0. For the best 
monitoring experience, please name your job with a valid Cloud Label. For 
details, see: 
https://cloud.google.com/compute/docs/labeling-resources#restrictions
Feb 03, 2021 12:47:11 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-03T12:47:10.422Z: Worker configuration: n1-standard-4 in 
us-central1-f.
Feb 03, 2021 12:47:11 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-03T12:47:11.192Z: Expanding CoGroupByKey operations into 
optimizable parts.
Feb 03, 2021 12:47:12 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-03T12:47:11.401Z: Expanding SplittableProcessKeyed operations 
into optimizable parts.
Feb 03, 2021 12:47:12 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-03T12:47:11.428Z: Expanding GroupByKey operations into streaming 
Read/Write steps
Feb 03, 2021 12:47:12 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-03T12:47:11.487Z: Lifting ValueCombiningMappingFns into 
MergeBucketsMappingFns
Feb 03, 2021 12:47:12 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-03T12:47:11.630Z: Fusing adjacent ParDo, Read, Write, and Flatten 
operations
Feb 03, 2021 12:47:12 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-03T12:47:11.659Z: Fusing consumer Read input/StripIds into Read 
input/DataflowRunner.StreamingUnboundedRead.ReadWithIds
Feb 03, 2021 12:47:12 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-03T12:47:11.697Z: Fusing consumer Collect start time metrics into 
Read input/StripIds
Feb 03, 2021 12:47:12 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-03T12:47:11.732Z: Fusing consumer Total bytes monitor into 
Collect start time metrics
Feb 03, 2021 12:47:12 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-03T12:47:11.769Z: Fusing consumer Window.Into()/Window.Assign 
into Total bytes monitor
Feb 03, 2021 12:47:12 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-03T12:47:11.812Z: Fusing consumer Group by key (0)/WriteStream 
into Window.Into()/Window.Assign
Feb 03, 2021 12:47:12 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-03T12:47:11.837Z: Fusing consumer Group by key (0)/MergeBuckets 
into Group by key (0)/ReadStream
Feb 03, 2021 12:47:12 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-03T12:47:11.878Z: Fusing consumer Ungroup and reiterate (0) into 
Group by key (0)/MergeBuckets
Feb 03, 2021 12:47:12 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-03T12:47:11.924Z: Fusing consumer Collect end time metrics (0) 
into Ungroup and reiterate (0)
Feb 03, 2021 12:47:14 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-03T12:47:12.689Z: Executing operation Group by key 
(0)/ReadStream+Group by key (0)/MergeBuckets+Ungroup and reiterate (0)+Collect 
end time metrics (0)
Feb 03, 2021 12:47:14 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-03T12:47:12.753Z: Executing operation Read 
input/DataflowRunner.StreamingUnboundedRead.ReadWithIds+Read 
input/StripIds+Collect start time metrics+Total bytes 
monitor+Window.Into()/Window.Assign+Group by key (0)/WriteStream
Feb 03, 2021 12:47:14 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-03T12:47:12.820Z: Starting 5 workers in us-central1-f...
Feb 03, 2021 12:47:41 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-03T12:47:41.029Z: Your project already contains 100 
Dataflow-created metric descriptors, so new user metrics of the form 
custom.googleapis.com/* will not be created. However, all user metrics are also 
available in the metric dataflow.googleapis.com/job/user_counter. If you rely 
on the custom metrics, you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
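
The warning above points at the 100-descriptor limit for custom metrics and suggests deleting old descriptors if the custom.googleapis.com/* metrics matter. As a hedged illustration only (this is not something the load test does), a clean-up with the Cloud Monitoring Java client could look roughly like this; the project id is a placeholder and the exact client calls should be checked against the google-cloud-monitoring version in use.

// Hedged sketch of the clean-up the warning suggests; project id is a
// placeholder and the calls assume the google-cloud-monitoring client library.
import com.google.api.MetricDescriptor;
import com.google.cloud.monitoring.v3.MetricServiceClient;
import com.google.monitoring.v3.ProjectName;

public class MetricDescriptorCleanup {
  public static void main(String[] args) throws Exception {
    try (MetricServiceClient client = MetricServiceClient.create()) {
      for (MetricDescriptor d :
          client.listMetricDescriptors(ProjectName.of("my-gcp-project")).iterateAll()) {
        // Only user-defined descriptors count toward the limit mentioned above.
        if (d.getType().startsWith("custom.googleapis.com/")) {
          client.deleteMetricDescriptor(d.getName());  // irreversible; filter carefully
        }
      }
    }
  }
}
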
Feb 03, 2021 12:47:53 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-03T12:47:52.066Z: Autoscaling: Raised the number of workers to 5 so 
that the pipeline can catch up with its backlog and keep up with its input rate.
Feb 03, 2021 12:48:21 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-03T12:48:20.393Z: Workers have started successfully.
Feb 03, 2021 12:48:21 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-03T12:48:20.423Z: Workers have started successfully.
Feb 03, 2021 4:00:34 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-03T16:00:34.205Z: Cancel request is committed for workflow job: 
2021-02-03_04_47_00-9334879096971007819.
Feb 03, 2021 4:00:34 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-03T16:00:34.249Z: Finished operation Group by key 
(0)/ReadStream+Group by key (0)/MergeBuckets+Ungroup and reiterate (0)+Collect 
end time metrics (0)
Feb 03, 2021 4:00:34 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-03T16:00:34.251Z: Finished operation Read 
input/DataflowRunner.StreamingUnboundedRead.ReadWithIds+Read 
input/StripIds+Collect start time metrics+Total bytes 
monitor+Window.Into()/Window.Assign+Group by key (0)/WriteStream
Feb 03, 2021 4:00:34 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-03T16:00:34.372Z: Cleaning up.
Feb 03, 2021 4:00:34 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-03T16:00:34.591Z: Stopping worker pool...
Feb 03, 2021 4:01:30 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-03T16:01:28.655Z: Autoscaling: Reduced the number of workers to 0 
based on low average worker CPU utilization, and the pipeline having sufficiently 
low backlog and keeping up with input rate.
Feb 03, 2021 4:01:30 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-03T16:01:28.894Z: Worker pool stopped.
Feb 03, 2021 4:01:37 PM org.apache.beam.runners.dataflow.DataflowPipelineJob 
logTerminalState
INFO: Job 2021-02-03_04_47_00-9334879096971007819 finished with status 
CANCELLED.
Load test results for test (ID): 8df1aeb5-e3ee-45dd-b502-11069937ad2f and 
timestamp: 2021-02-03T12:46:57.185000000Z:
                 Metric:                    Value:
    dataflow_runtime_sec                 11431.699
dataflow_total_bytes_count                1.999998E9
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
        at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
        at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:137)
        at org.apache.beam.sdk.loadtests.GroupByKeyLoadTest.run(GroupByKeyLoadTest.java:57)
        at org.apache.beam.sdk.loadtests.GroupByKeyLoadTest.main(GroupByKeyLoadTest.java:131)
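
The RuntimeException is the load test harness refusing a non-successful terminal state: the job was cancelled (see the "Cancel request is committed" line above), and JobFailure.handleFailure turns that into a build failure. A rough sketch of that kind of check is shown below; it only approximates what JobFailure.java actually does.

// Rough approximation of the terminal-state check behind the failure above;
// the real logic lives in org.apache.beam.sdk.loadtests.JobFailure.
import org.apache.beam.sdk.PipelineResult;

class TerminalStateCheck {
  static void handleFailure(PipelineResult result) {
    PipelineResult.State state = result.getState();
    if (state != PipelineResult.State.DONE) {
      // Here the job was cancelled, so state is CANCELLED and the run fails.
      throw new RuntimeException("Invalid job state: " + state + ".");
    }
  }
}
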

> Task :sdks:java:testing:load-tests:run FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/6.8/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 14m 48s
90 actionable tasks: 1 executed, 89 up-to-date

Publishing build scan...
https://gradle.com/s/ltt2hxpx3ihle

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
