See 
<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/49/display/redirect?page=changes>

Changes:

[Etienne Chauchot] [BEAM-12591] Put Spark Structured Streaming runner sources 
back to main

[Etienne Chauchot] [BEAM-12629] As spark DataSourceV2 is only available for 
spark 2,

[Etienne Chauchot] [BEAM-12627] Deal with spark Encoders breaking change between 
spark 2 and

[Etienne Chauchot] [BEAM-12591] move SchemaHelpers to correct package

[Etienne Chauchot] [BEAM-8470] Disable wait for termination in a streaming 
pipeline because

[Etienne Chauchot] [BEAM-12630] Deal with breaking change in streaming 
pipelines start by

[Etienne Chauchot] [BEAM-12629] Make source tests spark version agnostic and 
move them back

[Etienne Chauchot] [BEAM-12629] Make a spark 3 source impl

[ajamato] [BEAM-12670] Relocate bq client exception imports to try block and

[Etienne Chauchot] [BEAM-12591] Fix checkstyle and spotless

[Etienne Chauchot] [BEAM-12629] Reduce serializable to only needed classes and 
Fix schema

[Etienne Chauchot] [BEAM-12591] Add checkstyle exceptions for version specific 
classes

[Etienne Chauchot] [BEAM-12629] Fix sources javadocs and improve impl

[dpires] [BEAM-12715] Use shard number specified by user in SnowflakeIO batch

[Etienne Chauchot] [BEAM-12591] Add spark 3 to structured streaming validates 
runner tests

[relax] fix GroupIntoBatches

[noreply] [BEAM-12703] Fix universal metrics. (#15260)

[noreply] [BEAM-12702] Pull step unique names from pipeline for metrics. 
(#15261)

[noreply] [BEAM-12678] Add dependency of java jars when running go VR on 
portable

[noreply] [BEAM-12671] Mark known composite transforms native (#15236)

[noreply] [BEAM-6516] Fixes race condition in RabbitMqIO causing duplicate acks


------------------------------------------
[...truncated 66.00 KB...]
105cc467921f: Pushed
d7d869eafd69: Pushed
e613c09693fe: Pushed
f42aed5f7feb: Layer already exists
89819bafde36: Layer already exists
f3d5b8f65132: Layer already exists
f90211cbf29a: Pushed
ad83f0aa5c0a: Layer already exists
4b0edb23340c: Layer already exists
5a9a65095453: Layer already exists
afa3e488a0ee: Layer already exists
bafc2b387421: Pushed
52c82cddf467: Pushed
32249bc1f186: Pushed
20210805124354: digest: 
sha256:4b4f7c2c2a48604c17b7891a918ab489f87579c330b10965aa7eb7bc63de073c size: 
4310

> Task :sdks:java:testing:load-tests:run
Aug 05, 2021 12:51:31 PM 
org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory
 create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Aug 05, 2021 12:51:32 PM org.apache.beam.runners.dataflow.DataflowRunner 
fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from 
the classpath: will stage 191 files. Enable logging at DEBUG level to see which 
files will be staged.
Aug 05, 2021 12:51:33 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Aug 05, 2021 12:51:33 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing 
implications related to Google Compute Engine usage and other Google Cloud 
Services.
Aug 05, 2021 12:51:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to 
gs://temp-storage-for-perf-tests/loadtests/staging/
Aug 05, 2021 12:51:36 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading <112699 bytes, hash 
555293d1bbf87ed37bccdb58e5ce9e12073aaa40ce37a0ca450e91ceb040c25a> to 
gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-VVKT0bv4ftN7zNtY5c6eEgc6qkDON6DKRQ6RzrBAwlo.pb
Aug 05, 2021 12:51:38 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
INFO: Uploading 191 files from PipelineOptions.filesToStage to staging location 
to prepare for execution.
Aug 05, 2021 12:51:40 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
INFO: Staging files complete: 191 files cached, 0 files newly uploaded in 1 
seconds
Aug 05, 2021 12:51:40 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as 
step s1
Aug 05, 2021 12:51:40 PM 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: 
[org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3c3c4a71, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1352434e, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4f9a6c2d, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2b6fcb9f, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@75de6341, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@74170687, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@68f0f72c, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3d96fa9e, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3b545206, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77bb48d5, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@181d8899, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@12d5c30e, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@b887730, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@26586b74, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@52f57666, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6e041285, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@267dc982, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@439b15f2, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3aa41da1, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@74fab04a]
Aug 05, 2021 12:51:40 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Aug 05, 2021 12:51:40 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Aug 05, 2021 12:51:40 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Aug 05, 2021 12:51:40 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as 
step s5
Aug 05, 2021 12:51:40 PM 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: 
[org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@44ed0a8f, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@32177fa5, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@a96d56c, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6ab4a5b, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2abe9173, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@235d29d6, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1fdca564, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43f9dd56, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1d12e953, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@57cb70be, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2d4608a6, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@20d87335, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2a8a4e0c, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@26c89563, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3bd6ba24, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@58f437b0, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@20f6f88c, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4277127c, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4c7e978c, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@354e7004]
Aug 05, 2021 12:51:40 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Aug 05, 2021 12:51:40 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Aug 05, 2021 12:51:40 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Aug 05, 2021 12:51:40 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Aug 05, 2021 12:51:40 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Aug 05, 2021 12:51:40 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Aug 05, 2021 12:51:40 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Aug 05, 2021 12:51:40 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Aug 05, 2021 12:51:40 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Aug 05, 2021 12:51:40 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Aug 05, 2021 12:51:40 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Aug 05, 2021 12:51:40 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
Aug 05, 2021 12:51:41 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-05_05_51_40-12932913610290942808?project=apache-beam-testing
Aug 05, 2021 12:51:41 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-08-05_05_51_40-12932913610290942808
Aug 05, 2021 12:51:41 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel 
> --region=us-central1 2021-08-05_05_51_40-12932913610290942808
Aug 05, 2021 12:51:49 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-08-05T12:51:48.483Z: The workflow name is not a valid Cloud 
Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring 
will be labeled with this modified job name: 
load0tests0java110dataflow0v20streaming0cogbk01-jenkins-08-iiph. For the best 
monitoring experience, please name your job with a valid Cloud Label. For 
details, see: 
https://cloud.google.com/compute/docs/labeling-resources#restrictions
Aug 05, 2021 12:51:55 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-05T12:51:53.565Z: Worker configuration: e2-standard-2 in 
us-central1-a.
Aug 05, 2021 12:51:55 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-05T12:51:54.112Z: Expanding SplittableParDo operations into 
optimizable parts.
Aug 05, 2021 12:51:55 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-05T12:51:54.136Z: Expanding CollectionToSingleton operations into 
optimizable parts.
Aug 05, 2021 12:51:55 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-05T12:51:54.210Z: Expanding CoGroupByKey operations into 
optimizable parts.
Aug 05, 2021 12:51:55 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-05T12:51:54.298Z: Expanding SplittableProcessKeyed operations 
into optimizable parts.
Aug 05, 2021 12:51:55 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-05T12:51:54.334Z: Expanding GroupByKey operations into streaming 
Read/Write steps
Aug 05, 2021 12:51:55 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-05T12:51:54.399Z: Lifting ValueCombiningMappingFns into 
MergeBucketsMappingFns
Aug 05, 2021 12:51:55 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-05T12:51:54.543Z: Fusing adjacent ParDo, Read, Write, and Flatten 
operations
Aug 05, 2021 12:51:55 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-05T12:51:54.578Z: Unzipping flatten CoGroupByKey-Flatten for 
input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Aug 05, 2021 12:51:55 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-05T12:51:54.602Z: Fusing unzipped copy of 
CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into 
producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Aug 05, 2021 12:51:55 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-05T12:51:54.631Z: Fusing consumer CoGroupByKey/GBK/WriteStream 
into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Aug 05, 2021 12:51:55 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-05T12:51:54.700Z: Fusing consumer Read 
input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read 
input/Impulse
Aug 05, 2021 12:51:55 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-05T12:51:54.735Z: Fusing consumer 
Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
 into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Aug 05, 2021 12:51:55 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-05T12:51:54.767Z: Fusing consumer 
Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing
 into 
Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Aug 05, 2021 12:51:55 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-05T12:51:54.808Z: Fusing consumer Read 
input/ParDo(StripIds)/ParMultiDo(StripIds) into 
Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Aug 05, 2021 12:51:55 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-05T12:51:54.832Z: Fusing consumer Collect start time metrics 
(input)/ParMultiDo(TimeMonitor) into Read 
input/ParDo(StripIds)/ParMultiDo(StripIds)
Aug 05, 2021 12:51:55 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-05T12:51:54.874Z: Fusing consumer Window.Into()/Window.Assign 
into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Aug 05, 2021 12:51:55 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-05T12:51:54.909Z: Fusing consumer 
CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into 
Window.Into()/Window.Assign
Aug 05, 2021 12:51:55 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-05T12:51:54.942Z: Fusing consumer Read 
co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read 
co-input/Impulse
Aug 05, 2021 12:51:55 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-05T12:51:54.970Z: Fusing consumer 
Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
 into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Aug 05, 2021 12:51:55 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-05T12:51:54.993Z: Fusing consumer 
Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing
 into 
Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Aug 05, 2021 12:51:55 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-05T12:51:55.022Z: Fusing consumer Read 
co-input/ParDo(StripIds)/ParMultiDo(StripIds) into 
Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Aug 05, 2021 12:51:55 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-05T12:51:55.043Z: Fusing consumer Collect start time metrics 
(co-input)/ParMultiDo(TimeMonitor) into Read 
co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Aug 05, 2021 12:51:55 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-05T12:51:55.076Z: Fusing consumer Window.Into()2/Window.Assign 
into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Aug 05, 2021 12:51:55 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-05T12:51:55.111Z: Fusing consumer 
CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into 
Window.Into()2/Window.Assign
Aug 05, 2021 12:51:57 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-05T12:51:55.145Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets 
into CoGroupByKey/GBK/ReadStream
Aug 05, 2021 12:51:57 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-05T12:51:55.178Z: Fusing consumer 
CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into 
CoGroupByKey/GBK/MergeBuckets
Aug 05, 2021 12:51:57 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-05T12:51:55.206Z: Fusing consumer Ungroup and 
reiterate/ParMultiDo(UngroupAndReiterate) into 
CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Aug 05, 2021 12:51:57 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-05T12:51:55.233Z: Fusing consumer Collect total 
bytes/ParMultiDo(ByteMonitor) into Ungroup and 
reiterate/ParMultiDo(UngroupAndReiterate)
Aug 05, 2021 12:51:57 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-05T12:51:55.265Z: Fusing consumer Collect end time 
metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Aug 05, 2021 12:51:57 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-05T12:51:55.641Z: Starting 5 workers in us-central1-a...
Aug 05, 2021 12:52:22 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-05T12:52:22.932Z: Your project already contains 100 
Dataflow-created metric descriptors, so new user metrics of the form 
custom.googleapis.com/* will not be created. However, all user metrics are also 
available in the metric dataflow.googleapis.com/job/user_counter. If you rely 
on the custom metrics, you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Aug 05, 2021 12:52:26 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-05T12:52:25.368Z: Autoscaling: Raised the number of workers to 2 so 
that the pipeline can catch up with its backlog and keep up with its input rate.
Aug 05, 2021 12:52:26 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-05T12:52:25.402Z: Resized worker pool to 2, though goal was 5.  
This could be a quota issue.
Aug 05, 2021 12:52:36 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-05T12:52:35.681Z: Autoscaling: Raised the number of workers to 5 so 
that the pipeline can catch up with its backlog and keep up with its input rate.
Aug 05, 2021 12:53:28 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-05T12:53:27.100Z: Workers have started successfully.
Aug 05, 2021 12:53:28 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-05T12:53:27.133Z: Workers have started successfully.
Aug 05, 2021 4:00:41 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-05T16:00:40.807Z: Cancel request is committed for workflow job: 
2021-08-05_05_51_40-12932913610290942808.
Aug 05, 2021 4:00:41 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-05T16:00:40.905Z: Cleaning up.
Aug 05, 2021 4:00:41 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-05T16:00:40.987Z: Stopping worker pool...
Aug 05, 2021 4:00:41 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-05T16:00:41.033Z: Stopping worker pool...
Aug 05, 2021 4:03:13 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-05T16:03:13.474Z: Autoscaling: Reduced the number of workers to 0 
based on low average worker CPU utilization, and the pipeline having sufficiently 
low backlog and keeping up with input rate.
Aug 05, 2021 4:03:13 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-05T16:03:13.508Z: Worker pool stopped.
Aug 05, 2021 4:03:20 PM org.apache.beam.runners.dataflow.DataflowPipelineJob 
logTerminalState
INFO: Job 2021-08-05_05_51_40-12932913610290942808 finished with status 
CANCELLED.
Aug 05, 2021 4:03:20 PM org.apache.beam.sdk.testutils.metrics.MetricsReader 
getCounterMetric
SEVERE: Failed to get metric totalBytes.count, from namespace co_gbk
Load test results for test (ID): d406f90f-42fe-4291-a9ec-1c64412dd987 and 
timestamp: 2021-08-05T12:51:32.497000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                       0.0
dataflow_v2_java11_total_bytes_count                      -1.0
Exception in thread "main" java.lang.RuntimeException: Invalid job state: 
CANCELLED.
        at 
org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
        at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
        at 
org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
        at 
org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210805124354
Untagged: 
us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:4b4f7c2c2a48604c17b7891a918ab489f87579c330b10965aa7eb7bc63de073c
Deleted: sha256:ea0b7bf170f0af89f9bad837c551cd70ceac10305742c0242ff51a8a4876620b
Deleted: sha256:d1d6b7634994eb58e110930c0ecb35692daee7e4e864d7897a2a309632a62f3c
Deleted: sha256:14788254f7d10833d2e668f4b7b906c261fc7e534f9e1d4c9707fcd91185427f
Deleted: sha256:4c21031e25abbcfb7b6ccb841ba3f0843f359d28616b14f2e1676425af8d8bf8
Deleted: sha256:018f640d4288400e0827071abf200cb6d587298ca4a121a9e4512765eb71118b
Deleted: sha256:aadf4dfbec4a2a9fb38aaea7057d59f1a16682c98ce6af90088dd6a0aa6d1922
Deleted: sha256:51842ae6b83a47bf3100d232a9ba2763ca02089935b3cf250e8596072c8f501f
Deleted: sha256:975bf50e7fbedde807e37616eae062a68df9ab54c8d3d6147d90bef11925a8f4
Deleted: sha256:09caa155b057f85b509b879f14d0ea7c1e485ebdd9354eb74e57089b8a7566e0
Deleted: sha256:bb0d8b9e315694f241f79e54a33d9d83d2cf188f3aebe15d3200b9ca1c367c8e
Deleted: sha256:dcdde4da7bc7e5e49b4edd1a32a1990fff053629f47cc78b1d77eb1587344653
Deleted: sha256:c8ec9f92cee2ef63ffcb29202d6de50879f0c87d33a6c0ee6ca843ec1a3161da
Deleted: sha256:a1849371c370c4344f2e41fba6acc2e96d1d71f9bf24646d42fbcbdb74d55fbe
Deleted: sha256:3e2492d9515fffeefe02e7acc076bf56653bc7913704e2617fc5ca29a3c0cd14
Deleted: sha256:3b3fe95b54a404956ec23b3ef2d683a91b94ab67c7c579344b17f06ac0647bb0
Deleted: sha256:eb02cd9ead098b820cd746510d1a9596880af1210be555ee638c322644167122
Deleted: sha256:d2c6d056df3cb875caaedf879139d5fbbbabf46292568208158e82f55357825c
Deleted: sha256:7eb3f670c072a198399e5296bbc1805c71051d5f72603f596adf994b2b909cb3
Deleted: sha256:db77e4a3a3c9cfa47e1a0f168d977e11de2f97e3aa3a73794b11c8c1ec5e0c21
Deleted: sha256:7e48d05b522502f3085b4bed9be889f6291bc368f737c54c1c59357798ab6037
Deleted: sha256:21a2ac4698bc338e3b71cf690707a042ae045d844afc48a5095fffc9cb3114f0
Deleted: sha256:4a1d2eb3134109d2f4f820401cd8187b754d572b64ca6bf311689ceb3e43ae1e
Deleted: sha256:264f03ba9f489cbae86e3b76cf36f8920950634f124c2fd98933ce06773f8fb2
Deleted: sha256:b3e47cd8aec70ce65817793014bc76d8501f1e58704264e4e4862aee4205cdcd
Deleted: sha256:2c9cda29a87415606cd18851ef9e65171f93d958cde49a75f056b16bcdcaf3ac
Deleted: sha256:20742cee675c6890f2fa7bdd78b03839a70ba4596efb5650e2e501d9a3be17da
Deleted: sha256:a2664dd8a165f32220a04ea5156da94722bfd8b466de858caa7bd63e0467a0f3
Deleted: sha256:a173e282e735d93d3fab3276e1d0d5d0eafa62e21c8ca2f59d2118c6e956fc20
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210805124354]
- referencing digest: 
[us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:4b4f7c2c2a48604c17b7891a918ab489f87579c330b10965aa7eb7bc63de073c]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210805124354] 
(referencing 
[us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:4b4f7c2c2a48604c17b7891a918ab489f87579c330b10965aa7eb7bc63de073c])].
Removing untagged image 
us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:4b4f7c2c2a48604c17b7891a918ab489f87579c330b10965aa7eb7bc63de073c
Digests:
- 
us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:4b4f7c2c2a48604c17b7891a918ab489f87579c330b10965aa7eb7bc63de073c
Deleted 
[us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:4b4f7c2c2a48604c17b7891a918ab489f87579c330b10965aa7eb7bc63de073c].
Removing untagged image 
us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:fa3174bce374b6fc640d79bcebecc861796e8b3f403885041ff95aee6be4769c
Digests:
- 
us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:fa3174bce374b6fc640d79bcebecc861796e8b3f403885041ff95aee6be4769c
Deleted 
[us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:fa3174bce374b6fc640d79bcebecc861796e8b3f403885041ff95aee6be4769c].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with 
> non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 59s
107 actionable tasks: 76 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/unaiwizv6756w

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

