See 
<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/142/display/redirect?page=changes>

Changes:

[noreply] [adhoc] Speedup slow tests for AWS IO modules (#15899)

[noreply] [BEAM-12566] Implement set_axis for DataFrame and Series (#15773)

[noreply] [BEAM-13192] Fix buggy retry tests for AWS SnsIO (#15910)

[noreply] [BEAM-11217] Metrics Query filtering for DoFn metrics. (#15887)

[noreply] [Go SDK] Go SDK Exits Experimental (#15894)

[joseinigo] [BEAM-13080] Fix number of default keys

[joseinigo] [BEAM-13080] Fix number of default keys


------------------------------------------
[...truncated 48.60 KB...]
b54e21fc98b6: Preparing
80513f585ca8: Preparing
b306fc598105: Preparing
e1ea39e25414: Preparing
2c253a1ecb3d: Preparing
e823c04abb62: Preparing
f8506eb6d448: Preparing
66dbdac24a88: Preparing
78700b6b35d0: Preparing
62a5b8741e83: Preparing
36e0782f1159: Preparing
ba6e5ff31f23: Preparing
9f9f651e9303: Preparing
0b3c02b5d746: Preparing
62a747bf1719: Preparing
e823c04abb62: Waiting
2c253a1ecb3d: Waiting
e1ea39e25414: Waiting
f8506eb6d448: Waiting
80513f585ca8: Waiting
66dbdac24a88: Waiting
b306fc598105: Waiting
9f9f651e9303: Waiting
36e0782f1159: Waiting
78700b6b35d0: Waiting
0b3c02b5d746: Waiting
62a5b8741e83: Waiting
ba6e5ff31f23: Waiting
62a747bf1719: Waiting
b54e21fc98b6: Pushed
effff550acfb: Pushed
999d8bb81cd1: Pushed
79f8b53e4f15: Pushed
c966e6346529: Pushed
80513f585ca8: Pushed
e1ea39e25414: Pushed
2c253a1ecb3d: Pushed
78700b6b35d0: Layer already exists
62a5b8741e83: Layer already exists
36e0782f1159: Layer already exists
ba6e5ff31f23: Layer already exists
b306fc598105: Pushed
9f9f651e9303: Layer already exists
62a747bf1719: Layer already exists
0b3c02b5d746: Layer already exists
66dbdac24a88: Pushed
f8506eb6d448: Pushed
e823c04abb62: Pushed
20211106124333: digest: sha256:3767c712d2b32b7250bfe8e82e722d8cb59e058c2022992b6e73288bb2a27063 size: 4311

> Task :sdks:java:testing:load-tests:run
Nov 06, 2021 12:45:32 PM 
org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory
 create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Nov 06, 2021 12:45:32 PM org.apache.beam.runners.dataflow.DataflowRunner 
fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from 
the classpath: will stage 197 files. Enable logging at DEBUG level to see which 
files will be staged.
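
The two INFO lines above report option defaults rather than problems: with no --stagingLocation the runner falls back to gcpTempLocation, and with no filesToStage it stages the whole classpath (197 files here). A minimal sketch of setting both explicitly on DataflowPipelineOptions; the bucket and jar path below are hypothetical, not values from this job:

    import java.util.Arrays;
    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class ExplicitStagingOptions {
      public static void main(String[] args) {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineOptions.class);
        // Explicit staging location instead of the gcpTempLocation fallback.
        options.setStagingLocation("gs://my-bucket/staging");                       // hypothetical bucket
        // Explicit jar list instead of defaulting to every classpath entry.
        options.setFilesToStage(Arrays.asList("/path/to/load-tests-bundled.jar"));  // hypothetical path
      }
    }
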
Nov 06, 2021 12:45:33 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
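
This WARNING appears because Window.Into() is applied more than once without an explicit name, which is also why the step list below contains both "Window.Into()" and "Window.Into()2". Giving each apply() its own name yields stable unique names; a small self-contained sketch (the transform names and window sizes here are illustrative, not the ones this load test uses):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.windowing.FixedWindows;
    import org.apache.beam.sdk.transforms.windowing.Window;
    import org.apache.beam.sdk.values.PCollection;
    import org.joda.time.Duration;

    public class StableTransformNames {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        PCollection<String> input = p.apply("Create input", Create.of("a", "b"));
        PCollection<String> coInput = p.apply("Create co-input", Create.of("c", "d"));
        // Naming each apply() avoids the auto-generated "Window.Into()" / "Window.Into()2" names.
        input.apply("Window input", Window.<String>into(FixedWindows.of(Duration.standardMinutes(1))));
        coInput.apply("Window co-input", Window.<String>into(FixedWindows.of(Duration.standardMinutes(1))));
        p.run().waitUntilFinish();
      }
    }
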
Nov 06, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing 
implications related to Google Compute Engine usage and other Google Cloud 
Services.
Nov 06, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
INFO: Uploading 197 files from PipelineOptions.filesToStage to staging location 
to prepare for execution.
Nov 06, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
INFO: Staging files complete: 197 files cached, 0 files newly uploaded in 0 
seconds
Nov 06, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to 
gs://temp-storage-for-perf-tests/loadtests/staging/
Nov 06, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading <109683 bytes, hash 
e0f99d2369055f65369016f95c646e02ac6e57f4db3c0df472216533e8cdd873> to 
gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-4PmdI2kFX2U2kBb5XGRuAqxuV_TbPA30ciFlM-jN2HM.pb
Nov 06, 2021 12:45:38 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as 
step s1
Nov 06, 2021 12:45:38 PM 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: 
[org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6ae62c7e, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2e869098, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37c36608, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5d497a91, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@617389a, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1c8f6a90, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3050ac2f, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@265bd546, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1937eaff, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e0bc8a3, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4b0f2299, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@33063f5b, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@15405bd6, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@352ed70d, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@70730db, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5793b87, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@12704e15, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@512575e9, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f1a16fe, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2373ad99]
Nov 06, 2021 12:45:38 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Nov 06, 2021 12:45:38 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Nov 06, 2021 12:45:38 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Nov 06, 2021 12:45:38 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as 
step s5
Nov 06, 2021 12:45:38 PM 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: 
[org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@285c6918, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@78a0ff63, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7c601d50, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@79b2852b, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@326d27ac, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4d499d65, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@313f8301, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5cc9d3d0, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7c2dfa2, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@661d88a, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4b0b64cc, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@59ce792e, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4860827a, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@404db674, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@50f097b5, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7add838c, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3662bdff, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1bb15351, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4fa822ad, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@597f0937]
Nov 06, 2021 12:45:38 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Nov 06, 2021 12:45:38 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Nov 06, 2021 12:45:38 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Nov 06, 2021 12:45:38 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Nov 06, 2021 12:45:38 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Nov 06, 2021 12:45:38 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Nov 06, 2021 12:45:38 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Nov 06, 2021 12:45:38 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Nov 06, 2021 12:45:38 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Nov 06, 2021 12:45:38 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Nov 06, 2021 12:45:38 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Nov 06, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.35.0-SNAPSHOT
Nov 06, 2021 12:45:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-11-06_05_45_38-12108461136062105938?project=apache-beam-testing
Nov 06, 2021 12:45:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-11-06_05_45_38-12108461136062105938
Nov 06, 2021 12:45:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-11-06_05_45_38-12108461136062105938
Nov 06, 2021 12:45:46 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-11-06T12:45:45.142Z: The workflow name is not a valid Cloud 
Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring 
will be labeled with this modified job name: 
load0tests0java110dataflow0v20streaming0cogbk01-jenkins-11-omxk. For the best 
monitoring experience, please name your job with a valid Cloud Label. For 
details, see: 
https://cloud.google.com/compute/docs/labeling-resources#restrictions
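
The label warning above fires because the submitted job name is not a valid Cloud label, so Dataflow rewrites it to the mangled form shown (load0tests0java110dataflow0v20streaming0cogbk01-jenkins-11-omxk). Supplying a label-compliant name up front (lowercase letters, digits, and dashes) keeps the monitoring labels readable; a sketch with a hypothetical job name:

    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class LabelFriendlyJobName {
      public static void main(String[] args) {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineOptions.class);
        // Equivalent to passing --jobName on the command line; the value is hypothetical.
        options.setJobName("cogbk-dataflow-v2-streaming-java11-load-test");
      }
    }
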
Nov 06, 2021 12:45:50 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:45:50.029Z: Worker configuration: e2-standard-2 in 
us-central1-a.
Nov 06, 2021 12:45:51 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:45:50.555Z: Expanding SplittableParDo operations into 
optimizable parts.
Nov 06, 2021 12:45:51 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:45:50.586Z: Expanding CollectionToSingleton operations into 
optimizable parts.
Nov 06, 2021 12:45:51 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:45:50.653Z: Expanding CoGroupByKey operations into 
optimizable parts.
Nov 06, 2021 12:45:51 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:45:50.728Z: Expanding SplittableProcessKeyed operations 
into optimizable parts.
Nov 06, 2021 12:45:51 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:45:50.753Z: Expanding GroupByKey operations into streaming 
Read/Write steps
Nov 06, 2021 12:45:51 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:45:50.821Z: Lifting ValueCombiningMappingFns into 
MergeBucketsMappingFns
Nov 06, 2021 12:45:51 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:45:50.914Z: Fusing adjacent ParDo, Read, Write, and Flatten 
operations
Nov 06, 2021 12:45:51 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:45:50.951Z: Unzipping flatten CoGroupByKey-Flatten for 
input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Nov 06, 2021 12:45:51 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:45:50.983Z: Fusing unzipped copy of 
CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into 
producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Nov 06, 2021 12:45:51 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:45:51.006Z: Fusing consumer CoGroupByKey/GBK/WriteStream 
into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Nov 06, 2021 12:45:51 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:45:51.041Z: Fusing consumer Read 
input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read 
input/Impulse
Nov 06, 2021 12:45:51 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:45:51.067Z: Fusing consumer 
Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
 into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Nov 06, 2021 12:45:51 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:45:51.098Z: Fusing consumer 
Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing
 into 
Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Nov 06, 2021 12:45:51 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:45:51.120Z: Fusing consumer Read 
input/ParDo(StripIds)/ParMultiDo(StripIds) into 
Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Nov 06, 2021 12:45:51 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:45:51.157Z: Fusing consumer Collect start time metrics 
(input)/ParMultiDo(TimeMonitor) into Read 
input/ParDo(StripIds)/ParMultiDo(StripIds)
Nov 06, 2021 12:45:51 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:45:51.192Z: Fusing consumer Window.Into()/Window.Assign 
into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Nov 06, 2021 12:45:51 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:45:51.225Z: Fusing consumer 
CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into 
Window.Into()/Window.Assign
Nov 06, 2021 12:45:51 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:45:51.251Z: Fusing consumer Read 
co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read 
co-input/Impulse
Nov 06, 2021 12:45:51 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:45:51.285Z: Fusing consumer 
Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
 into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Nov 06, 2021 12:45:51 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:45:51.314Z: Fusing consumer 
Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing
 into 
Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Nov 06, 2021 12:45:51 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:45:51.348Z: Fusing consumer Read 
co-input/ParDo(StripIds)/ParMultiDo(StripIds) into 
Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Nov 06, 2021 12:45:51 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:45:51.378Z: Fusing consumer Collect start time metrics 
(co-input)/ParMultiDo(TimeMonitor) into Read 
co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Nov 06, 2021 12:45:51 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:45:51.399Z: Fusing consumer Window.Into()2/Window.Assign 
into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Nov 06, 2021 12:45:51 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:45:51.435Z: Fusing consumer 
CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into 
Window.Into()2/Window.Assign
Nov 06, 2021 12:45:51 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:45:51.494Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets 
into CoGroupByKey/GBK/ReadStream
Nov 06, 2021 12:45:52 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:45:51.521Z: Fusing consumer 
CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into 
CoGroupByKey/GBK/MergeBuckets
Nov 06, 2021 12:45:52 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:45:51.550Z: Fusing consumer Ungroup and 
reiterate/ParMultiDo(UngroupAndReiterate) into 
CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Nov 06, 2021 12:45:52 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:45:51.574Z: Fusing consumer Collect total 
bytes/ParMultiDo(ByteMonitor) into Ungroup and 
reiterate/ParMultiDo(UngroupAndReiterate)
Nov 06, 2021 12:45:52 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:45:51.608Z: Fusing consumer Collect end time 
metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Nov 06, 2021 12:45:52 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:45:52.072Z: Starting 5 workers in us-central1-a...
Nov 06, 2021 12:46:11 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:46:09.614Z: Your project already contains 100 
Dataflow-created metric descriptors, so new user metrics of the form 
custom.googleapis.com/* will not be created. However, all user metrics are also 
available in the metric dataflow.googleapis.com/job/user_counter. If you rely 
on the custom metrics, you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Nov 06, 2021 12:46:38 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:46:37.836Z: Autoscaling: Raised the number of workers to 3 so 
that the pipeline can catch up with its backlog and keep up with its input rate.
Nov 06, 2021 12:46:38 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:46:37.861Z: Resized worker pool to 3, though goal was 5.  
This could be a quota issue.
Nov 06, 2021 12:46:49 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:46:48.163Z: Autoscaling: Raised the number of workers to 5 so 
that the pipeline can catch up with its backlog and keep up with its input rate.
Nov 06, 2021 12:47:36 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:47:36.437Z: Workers have started successfully.
Nov 06, 2021 12:47:36 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T12:47:36.463Z: Workers have started successfully.
Nov 06, 2021 4:00:37 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T16:00:36.044Z: Cancel request is committed for workflow job: 
2021-11-06_05_45_38-12108461136062105938.
Nov 06, 2021 4:00:37 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T16:00:36.108Z: Cleaning up.
Nov 06, 2021 4:00:37 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T16:00:36.176Z: Stopping worker pool...
Nov 06, 2021 4:00:37 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T16:00:36.229Z: Stopping worker pool...
Nov 06, 2021 4:03:27 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T16:03:26.712Z: Autoscaling: Reduced the number of workers to 0 
based on low average worker CPU utilization, and the pipeline having sufficiently 
low backlog and keeping up with input rate.
Nov 06, 2021 4:03:27 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-11-06T16:03:26.749Z: Worker pool stopped.
Nov 06, 2021 4:03:34 PM org.apache.beam.runners.dataflow.DataflowPipelineJob 
logTerminalState
INFO: Job 2021-11-06_05_45_38-12108461136062105938 finished with status 
CANCELLED.
Load test results for test (ID): fe22fbdc-a19f-469e-adaa-aac694f6e264 and 
timestamp: 2021-11-06T12:45:33.244000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11523.914
dataflow_v2_java11_total_bytes_count             2.67386971E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: 
CANCELLED.
        at 
org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
        at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
        at 
org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
        at 
org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211106124333
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3767c712d2b32b7250bfe8e82e722d8cb59e058c2022992b6e73288bb2a27063
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211106124333]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3767c712d2b32b7250bfe8e82e722d8cb59e058c2022992b6e73288bb2a27063]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211106124333] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3767c712d2b32b7250bfe8e82e722d8cb59e058c2022992b6e73288bb2a27063])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3767c712d2b32b7250bfe8e82e722d8cb59e058c2022992b6e73288bb2a27063
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3767c712d2b32b7250bfe8e82e722d8cb59e058c2022992b6e73288bb2a27063
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:3767c712d2b32b7250bfe8e82e722d8cb59e058c2022992b6e73288bb2a27063].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 20m 18s
101 actionable tasks: 71 executed, 28 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/pryongyp37moa

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
