See 
<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/92/display/redirect?page=changes>

Changes:

[randomstep] [BEAM-12899] Upgrade Gradle to version 6.9.x

[noreply] [BEAM-12701] Added extra parameter in to_csv for DeferredFrame to name

[zyichi] [BEAM-12603] Add retries to FnApiRunnerTest due to flakiness of grpc

[noreply] [BEAM-12535] add dataframes notebook (#15470)


------------------------------------------
[...truncated 49.92 KB...]
ebd13132d2dd: Pushed
9998d23a62f8: Pushed
49750b7dcccd: Pushed
257de4377a3c: Pushed
3891808a925b: Layer already exists
d402f4f1b906: Layer already exists
00ef5416d927: Layer already exists
8555e663f65b: Layer already exists
d00da3cd7763: Layer already exists
4e61e63529c2: Layer already exists
8af1e1eec513: Pushed
cdf5761c3cd6: Pushed
799760671c38: Layer already exists
e625d5644106: Pushed
9a3f32baf9a8: Pushed
20210917124334: digest: sha256:040fcf1a34c1227a112a378852e86b7db64735e7d422f7bbf3e461a5345dd510 size: 4311

> Task :sdks:java:testing:load-tests:run
Sep 17, 2021 12:45:22 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Sep 17, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner 
fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from 
the classpath: will stage 195 files. Enable logging at DEBUG level to see which 
files will be staged.
Sep 17, 2021 12:45:23 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
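This warning fires because the pipeline applies `Window.Into()` twice without explicit names, so Beam falls back to generated names and uniquifies the duplicates by appending a counter — later in this log the translator registers both "Window.Into()/Window.Assign" (step s4) and "Window.Into()2/Window.Assign" (step s8). Generated names like these are not stable across pipeline updates; passing an explicit name via `.apply("my name", ...)` avoids the warning. A stdlib-only sketch of the uniquification behavior as inferred from this log (the real logic lives inside Beam's `Pipeline` class; this is illustrative, not Beam's actual code):

```java
import java.util.HashMap;
import java.util.Map;

public class UniqueNames {
    private final Map<String, Integer> counts = new HashMap<>();

    // Duplicate default transform names pick up a numeric suffix,
    // e.g. "Window.Into()" -> "Window.Into()2" as seen in this log.
    public String uniquify(String name) {
        int n = counts.merge(name, 1, Integer::sum);
        return n == 1 ? name : name + n;
    }

    public static void main(String[] args) {
        UniqueNames names = new UniqueNames();
        System.out.println(names.uniquify("Window.Into()"));  // Window.Into()
        System.out.println(names.uniquify("Window.Into()"));  // Window.Into()2
    }
}
```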
Sep 17, 2021 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing 
implications related to Google Compute Engine usage and other Google Cloud 
Services.
Sep 17, 2021 12:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
INFO: Uploading 195 files from PipelineOptions.filesToStage to staging location 
to prepare for execution.
Sep 17, 2021 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
INFO: Staging files complete: 195 files cached, 0 files newly uploaded in 1 
seconds
Sep 17, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to 
gs://temp-storage-for-perf-tests/loadtests/staging/
Sep 17, 2021 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading <106569 bytes, hash 
68c7ddc9492113411748bb84e881c1fcc72a3845b1a74d3ca0b9986609d6b58a> to 
gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-aMfdyUkhE0EXSLuE6IHB_McqOEWxp008oLmYZgnWtYo.pb
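The staged object's name is derived from its content hash: the hex SHA-256 logged above, decoded to raw bytes and url-safe-base64-encoded without padding, yields exactly the `pipeline-….pb` name in the gs:// path. Content addressing like this is what enables the "195 files cached, 0 files newly uploaded" fast path earlier in the log. A small sketch reproducing the encoding from the logged values:

```java
import java.util.Base64;

public class StagedName {
    // Convert a hex string (e.g. a logged SHA-256) to raw bytes.
    static byte[] hexToBytes(String hex) {
        byte[] out = new byte[hex.length() / 2];
        for (int i = 0; i < out.length; i++) {
            out[i] = (byte) Integer.parseInt(hex.substring(2 * i, 2 * i + 2), 16);
        }
        return out;
    }

    public static void main(String[] args) {
        String hexHash =
            "68c7ddc9492113411748bb84e881c1fcc72a3845b1a74d3ca0b9986609d6b58a";
        // URL-safe base64 without padding, matching the staged object name.
        String encoded = Base64.getUrlEncoder().withoutPadding()
            .encodeToString(hexToBytes(hexHash));
        System.out.println("pipeline-" + encoded + ".pb");
        // -> pipeline-aMfdyUkhE0EXSLuE6IHB_McqOEWxp008oLmYZgnWtYo.pb
    }
}
```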
Sep 17, 2021 12:45:28 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as 
step s1
Sep 17, 2021 12:45:28 PM 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: 
[org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2d64160c, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5f254608, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2eeb0f9b, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1b1c538d, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1645f294, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6325f352, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@15c4af7a, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6cbd0674, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@55d58825, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@19a64eae, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@29a98d9f, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2da3b078, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@544e8149, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7fb66650, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1a96d94c, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2a869a16, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@ae202c6, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46aa712c, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6ada9c0c, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7412ed6b]
Sep 17, 2021 12:45:28 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Sep 17, 2021 12:45:28 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Sep 17, 2021 12:45:28 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Sep 17, 2021 12:45:28 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as 
step s5
Sep 17, 2021 12:45:28 PM 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: 
[org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@41853299, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@60d40ff4, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2e5b7fba, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27755487, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4f0cab0a, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fe7b6b0, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7ab4ae59, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77681ce4, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5d96bdf8, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f76c2cc, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@306f6f1d, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7d7cac8, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6fc6deb7, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@367f0121, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7da39774, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@441b8382, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1df1ced0, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5349b246, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@32b0876c, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2aaf152b]
Sep 17, 2021 12:45:28 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Sep 17, 2021 12:45:28 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Sep 17, 2021 12:45:28 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Sep 17, 2021 12:45:28 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Sep 17, 2021 12:45:28 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Sep 17, 2021 12:45:28 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Sep 17, 2021 12:45:28 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Sep 17, 2021 12:45:28 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Sep 17, 2021 12:45:28 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Sep 17, 2021 12:45:28 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Sep 17, 2021 12:45:28 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Sep 17, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
Sep 17, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-17_05_45_28-17019632363355082856?project=apache-beam-testing
Sep 17, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-09-17_05_45_28-17019632363355082856
Sep 17, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-17_05_45_28-17019632363355082856
Sep 17, 2021 12:45:37 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-17T12:45:36.321Z: The workflow name is not a valid Cloud 
Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring 
will be labeled with this modified job name: 
load0tests0java110dataflow0v20streaming0cogbk01-jenkins-09-4adb. For the best 
monitoring experience, please name your job with a valid Cloud Label. For 
details, see: 
https://cloud.google.com/compute/docs/labeling-resources#restrictions
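The modified name in the warning (load0tests0java110dataflow0v20streaming0cogbk01-jenkins-09-4adb) suggests Dataflow lowercases the job name and substitutes a character for anything a Cloud label may not contain. The exact rewrite rule is internal to Dataflow; the sketch below assumes lowercasing plus replacing every character outside lowercase letters, digits, and hyphens with '0', which reproduces the observed prefix for a hypothetical original job name (the input string here is an assumption, not taken from the log):

```java
public class CloudLabel {
    // Assumed sanitization: lowercase, keep [a-z0-9-], replace the rest
    // with '0', and cap at 63 characters (a common label length limit).
    // This is a guess at Dataflow's internal behavior, not its real code.
    static String sanitize(String name) {
        StringBuilder sb = new StringBuilder();
        for (char c : name.toLowerCase().toCharArray()) {
            boolean valid = (c >= 'a' && c <= 'z')
                || (c >= '0' && c <= '9') || c == '-';
            sb.append(valid ? c : '0');
        }
        return sb.substring(0, Math.min(sb.length(), 63));
    }

    public static void main(String[] args) {
        // Hypothetical original job name with underscores and capitals.
        System.out.println(
            sanitize("load_tests_Java11_Dataflow_V2_streaming_CoGBK_1"));
        // -> load0tests0java110dataflow0v20streaming0cogbk01
    }
}
```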
Sep 17, 2021 12:45:44 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:41.847Z: Worker configuration: e2-standard-2 in 
us-central1-a.
Sep 17, 2021 12:45:44 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:42.587Z: Expanding SplittableParDo operations into 
optimizable parts.
Sep 17, 2021 12:45:44 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:42.623Z: Expanding CollectionToSingleton operations into 
optimizable parts.
Sep 17, 2021 12:45:44 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:42.696Z: Expanding CoGroupByKey operations into 
optimizable parts.
Sep 17, 2021 12:45:44 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:42.767Z: Expanding SplittableProcessKeyed operations 
into optimizable parts.
Sep 17, 2021 12:45:44 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:42.800Z: Expanding GroupByKey operations into streaming 
Read/Write steps
Sep 17, 2021 12:45:44 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:42.867Z: Lifting ValueCombiningMappingFns into 
MergeBucketsMappingFns
Sep 17, 2021 12:45:44 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:42.983Z: Fusing adjacent ParDo, Read, Write, and Flatten 
operations
Sep 17, 2021 12:45:44 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:43.023Z: Unzipping flatten CoGroupByKey-Flatten for 
input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Sep 17, 2021 12:45:44 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:43.055Z: Fusing unzipped copy of 
CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into 
producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Sep 17, 2021 12:45:44 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:43.083Z: Fusing consumer CoGroupByKey/GBK/WriteStream 
into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Sep 17, 2021 12:45:44 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:43.107Z: Fusing consumer Read 
input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read 
input/Impulse
Sep 17, 2021 12:45:44 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:43.138Z: Fusing consumer 
Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
 into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Sep 17, 2021 12:45:44 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:43.194Z: Fusing consumer 
Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing
 into 
Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Sep 17, 2021 12:45:44 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:43.227Z: Fusing consumer Read 
input/ParDo(StripIds)/ParMultiDo(StripIds) into 
Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Sep 17, 2021 12:45:44 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:43.261Z: Fusing consumer Collect start time metrics 
(input)/ParMultiDo(TimeMonitor) into Read 
input/ParDo(StripIds)/ParMultiDo(StripIds)
Sep 17, 2021 12:45:44 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:43.294Z: Fusing consumer Window.Into()/Window.Assign 
into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Sep 17, 2021 12:45:44 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:43.327Z: Fusing consumer 
CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into 
Window.Into()/Window.Assign
Sep 17, 2021 12:45:44 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:43.382Z: Fusing consumer Read 
co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read 
co-input/Impulse
Sep 17, 2021 12:45:44 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:43.419Z: Fusing consumer 
Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
 into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Sep 17, 2021 12:45:44 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:43.453Z: Fusing consumer 
Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing
 into 
Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Sep 17, 2021 12:45:44 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:43.487Z: Fusing consumer Read 
co-input/ParDo(StripIds)/ParMultiDo(StripIds) into 
Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Sep 17, 2021 12:45:44 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:43.520Z: Fusing consumer Collect start time metrics 
(co-input)/ParMultiDo(TimeMonitor) into Read 
co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Sep 17, 2021 12:45:44 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:43.559Z: Fusing consumer Window.Into()2/Window.Assign 
into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Sep 17, 2021 12:45:44 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:43.592Z: Fusing consumer 
CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into 
Window.Into()2/Window.Assign
Sep 17, 2021 12:45:44 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:43.626Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets 
into CoGroupByKey/GBK/ReadStream
Sep 17, 2021 12:45:44 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:43.651Z: Fusing consumer 
CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into 
CoGroupByKey/GBK/MergeBuckets
Sep 17, 2021 12:45:44 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:43.682Z: Fusing consumer Ungroup and 
reiterate/ParMultiDo(UngroupAndReiterate) into 
CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Sep 17, 2021 12:45:44 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:43.719Z: Fusing consumer Collect total 
bytes/ParMultiDo(ByteMonitor) into Ungroup and 
reiterate/ParMultiDo(UngroupAndReiterate)
Sep 17, 2021 12:45:44 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:43.749Z: Fusing consumer Collect end time 
metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Sep 17, 2021 12:45:44 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:44.065Z: Starting 5 workers in us-central1-a...
Sep 17, 2021 12:45:59 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:45:56.770Z: Your project already contains 100 
Dataflow-created metric descriptors, so new user metrics of the form 
custom.googleapis.com/* will not be created. However, all user metrics are also 
available in the metric dataflow.googleapis.com/job/user_counter. If you rely 
on the custom metrics, you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Sep 17, 2021 12:46:24 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:46:23.992Z: Autoscaling: Raised the number of workers to 5 so 
that the pipeline can catch up with its backlog and keep up with its input rate.
Sep 17, 2021 12:47:24 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:47:23.700Z: Workers have started successfully.
Sep 17, 2021 12:47:24 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T12:47:23.730Z: Workers have started successfully.
Sep 17, 2021 4:00:29 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T16:00:27.591Z: Cancel request is committed for workflow job: 
2021-09-17_05_45_28-17019632363355082856.
Sep 17, 2021 4:00:29 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T16:00:27.669Z: Cleaning up.
Sep 17, 2021 4:00:29 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T16:00:27.733Z: Stopping worker pool...
Sep 17, 2021 4:00:29 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T16:00:27.783Z: Stopping worker pool...
Sep 17, 2021 4:00:38 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-17T16:00:36.078Z: generic::internal: The work item requesting 
state read is no longer valid on the backend. The work has already completed or 
will be retried. This is expected during autoscaling events.
passed through:
==>
    
dist_proc/dax/workflow/****/streaming/streaming_rpc_windmill_service_server.cc:1089
generic::internal: The work item requesting state read is no longer valid on 
the backend. The work has already completed or will be retried. This is 
expected during autoscaling events.
passed through:
==>
    
dist_proc/dax/workflow/****/streaming/streaming_rpc_windmill_service_server.cc:1089
Sep 17, 2021 4:02:48 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T16:02:47.614Z: Autoscaling: Reduced the number of workers to 0 
based on low average worker CPU utilization, and the pipeline having sufficiently 
low backlog and keeping up with input rate.
Sep 17, 2021 4:02:48 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-17T16:02:47.650Z: Worker pool stopped.
Sep 17, 2021 4:02:53 PM org.apache.beam.runners.dataflow.DataflowPipelineJob 
logTerminalState
INFO: Job 2021-09-17_05_45_28-17019632363355082856 finished with status 
CANCELLED.
Load test results for test (ID): b0ec15fe-b275-441d-9c56-55e080f6e2c1 and 
timestamp: 2021-09-17T12:45:22.952000000Z:

                              Metric:          Value:
      dataflow_v2_java11_runtime_sec       11548.337
dataflow_v2_java11_total_bytes_count   2.04755766E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: 
CANCELLED.
        at 
org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
        at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
        at 
org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
        at 
org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
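The job was cancelled externally (the cancel request at 16:00 above), and the load-test harness treats that as a failure: per the stack trace, `JobFailure.handleFailure` raises a `RuntimeException` for a terminal state other than DONE, which Gradle then surfaces as the non-zero exit below. A stdlib-only sketch of that check, with an illustrative state enum standing in for Dataflow's job states (the real logic is in `org.apache.beam.sdk.loadtests.JobFailure`):

```java
public class TerminalStateCheck {
    // Illustrative stand-in for Dataflow's terminal job states.
    enum State { DONE, FAILED, CANCELLED, UPDATED }

    // Mirrors the observed behavior: any terminal state other than DONE
    // is surfaced as a RuntimeException that fails the build.
    static void handleFailure(State state) {
        if (state != State.DONE) {
            throw new RuntimeException("Invalid job state: " + state + ".");
        }
    }

    public static void main(String[] args) {
        try {
            handleFailure(State.CANCELLED);
        } catch (RuntimeException e) {
            System.out.println(e.getMessage());
            // -> Invalid job state: CANCELLED.
        }
    }
}
```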

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210917124334
Untagged: 
us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:040fcf1a34c1227a112a378852e86b7db64735e7d422f7bbf3e461a5345dd510
Deleted: sha256:51220a157e98970e581fc82c17ae080a7d75c198d4bd5dec5c6ff4c8c0110d2b
Deleted: sha256:1d1b931ea7dd3188599d30a0b00df141ada7a7a4a5ccf60c4e7aa7e60618b2c1
Deleted: sha256:5dbac7b2b241cce8f57417a7d9e6f39a8a6c8fa4d2d1d826f4f672a1ddd49f8f
Deleted: sha256:42a17ddefe018c5adf4da9391b84c1dd878323bb26c6cd1ce078c800506e988a
Deleted: sha256:b7db880cee92f9f53fe649bc6435667f285c74ac9f09897f045207a531478fd9
Deleted: sha256:fb08917ee3996cc5113be9f90a75c20b2eb236bbd16c3c3392f0384eba179726
Deleted: sha256:ded0f8bcd7e15dcb58d3b9f1e8f7d42efbfc06a5cfc6af904d9ec75981f1ecf2
Deleted: sha256:7628ace029b201d217f843a245a0a469e8009f9478092381a5b31efd99ea02be
Deleted: sha256:db83a98067892f629c0dc154a4a11a4d07b95f5546030efad558b78f4a8c34f7
Deleted: sha256:e6cbf762df70e1132a2ae92700d379b555096e58c03da72b5dcc4403e34dcfee
Deleted: sha256:adea99ad81c3fec89cbda76e18d733d27ed8475a65c292d5149d5255d4df37a3
Deleted: sha256:c54d27b5b88135d2c48b53071e2fffab92504a37258eba4f7070888d1a6e4a12
Deleted: sha256:aabf746fb1f34f31d269d68d4397b4fa1499c3a58cfba5815caf6b77852beac8
Deleted: sha256:548b9b1151932c97ed1c2e7b3ecf31165a2f8f6f9a97e44c898e97d1b30bdda3
Deleted: sha256:0bc79cf7c6bd3e9413c107bd1a89eaf0fa06f837792e422e0104ace9758f4e13
Deleted: sha256:66ec577042f379f6cab0f3aba2c14996fb4d892202db514b46f49f669b37828f
Deleted: sha256:9e555a5ad620739800a5be935c9bac8a0fdbbbb9f3c85a81751e64a4478c9432
Deleted: sha256:7a249063b8af977ec773a8ac90ff3fc936007d5a062f301532249dc8929b4625
Deleted: sha256:d2a7bcf00ba40eba94e52a90902d7f5dea6413542bd5883f09fb0503ae227849
Deleted: sha256:35401cb89023ae465a3ed458da089982ec2ada8929aaa9eab7b5c0774406fe97
Deleted: sha256:f5e4f0993712ead6150fe6808de2216017732a6313eae29b966dfdec41128c26
Deleted: sha256:40d95b9e1a99f529776a937184a32d7ccaec26e3c236f9072694e9492d966b46
Deleted: sha256:31db754218e7983d9d30dbf3af0b78c3866904296ace832c41e9e6b238391c34
Deleted: sha256:5e98107d8799b66fdc959243e6051aaedd58bf8fd3a2fa32f528d89608c28912
Deleted: sha256:596eff7aac946664bc43bbcacf63fa5c0759211999a5ec34c9d76a29b340d19f
Deleted: sha256:4485b5c195cb622147812aa99f66a6c759e5358c9e458836d012ac296aa44c96
Deleted: sha256:30fcfe67a4e49646910b554d343c76bd336257c8ae792f41978ea002f32e9ec9
Deleted: sha256:2a61b929e2767dac14ba8a95aa1a8c68916cdd4694f2166f73e207e7b6ecd516
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210917124334]
- referencing digest: 
[us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:040fcf1a34c1227a112a378852e86b7db64735e7d422f7bbf3e461a5345dd510]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210917124334] 
(referencing 
[us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:040fcf1a34c1227a112a378852e86b7db64735e7d422f7bbf3e461a5345dd510])].
Removing untagged image 
us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:040fcf1a34c1227a112a378852e86b7db64735e7d422f7bbf3e461a5345dd510
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:040fcf1a34c1227a112a378852e86b7db64735e7d422f7bbf3e461a5345dd510
Deleted 
[us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:040fcf1a34c1227a112a378852e86b7db64735e7d422f7bbf3e461a5345dd510].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 38s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/sqswlj37yu3za

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
