See 
<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/97/display/redirect?page=changes>

Changes:

[noreply] [BEAM-12919] Removed IBM Streams from runner matrix (#15542)

[danthev] Fix 2.32.0 release notes.

[noreply] [BEAM-12258] Re-throw exception from forked thread in

[kawaigin] [BEAM-10708] Added an example notebook for beam_sql magic

[noreply] Add a timeout for BQ streaming_insert RPCS (#15541)

[noreply] Merge pull request #15537 from [BEAM-12908] Add a sleep to the IT after


------------------------------------------
[...truncated 49.31 KB...]
b0c95b2954a3: Pushed
75862a2f912c: Pushed
3c77729b931b: Pushed
e0858b5565a7: Pushed
c23774ca1987: Pushed
3891808a925b: Layer already exists
d402f4f1b906: Layer already exists
00ef5416d927: Layer already exists
8555e663f65b: Layer already exists
d00da3cd7763: Layer already exists
4e61e63529c2: Layer already exists
b63bcb3be20b: Pushed
799760671c38: Layer already exists
5d294ac669be: Pushed
c036ede0a712: Pushed
24d403efe7b4: Pushed
20210922124335: digest: sha256:eff74314ab303743ab8314e947d1999c5f0b633378bc0789128ffce6bf27b20a size: 4311

> Task :sdks:java:testing:load-tests:run
Sep 22, 2021 12:45:33 PM 
org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory
 create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Sep 22, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowRunner 
fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from 
the classpath: will stage 195 files. Enable logging at DEBUG level to see which 
files will be staged.
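
The two INFO messages above describe defaults that can be overridden explicitly. Below is a minimal sketch of doing so through DataflowPipelineOptions; the jar path is a placeholder, and the setFilesToStage setter is assumed to be reachable from DataflowPipelineOptions, as the log's reference to PipelineOptions.filesToStage suggests.

import java.util.Arrays;
import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class ExplicitStagingOptions {
  public static void main(String[] args) {
    DataflowPipelineOptions options =
        PipelineOptionsFactory.fromArgs(args).withValidation().as(DataflowPipelineOptions.class);

    // Without an explicit stagingLocation, StagingLocationFactory falls back
    // to gcpTempLocation, as the first INFO message above reports.
    options.setGcpTempLocation("gs://temp-storage-for-perf-tests/loadtests");
    options.setStagingLocation("gs://temp-storage-for-perf-tests/loadtests/staging");

    // Without filesToStage, the runner stages every jar it detects on the
    // classpath (195 files in this run). The jar path below is a placeholder.
    options.setFilesToStage(Arrays.asList("/path/to/load-tests-bundled.jar"));
  }
}
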
Sep 22, 2021 12:45:34 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
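
The warning above appears because two Window.Into() transforms were applied without explicit names, so their auto-generated names are not guaranteed to be unique or stable. A hedged sketch (not the test's actual code) of applying them with explicit names follows; the element type and the 60-second window size are illustrative assumptions.

import org.apache.beam.sdk.transforms.windowing.FixedWindows;
import org.apache.beam.sdk.transforms.windowing.Window;
import org.apache.beam.sdk.values.KV;
import org.apache.beam.sdk.values.PCollection;
import org.joda.time.Duration;

// Giving each Window.Into() application a unique, explicit name keeps the
// pipeline's step names stable across runs and silences the warning.
static PCollection<KV<byte[], byte[]>> windowWithName(
    String name, PCollection<KV<byte[], byte[]>> in) {
  return in.apply(
      name, Window.<KV<byte[], byte[]>>into(FixedWindows.of(Duration.standardSeconds(60))));
}
// e.g. windowWithName("Window input", input) and windowWithName("Window co-input", coInput)
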
Sep 22, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing 
implications related to Google Compute Engine usage and other Google Cloud 
Services.
Sep 22, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
INFO: Uploading 195 files from PipelineOptions.filesToStage to staging location 
to prepare for execution.
Sep 22, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
INFO: Staging files complete: 195 files cached, 0 files newly uploaded in 0 
seconds
Sep 22, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to 
gs://temp-storage-for-perf-tests/loadtests/staging/
Sep 22, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading <106569 bytes, hash 
fbdd9ba961cf8321bc47225252568573fc4a4cb0e7fb71b2ccd623e964808a11> to 
gs://temp-storage-for-perf-tests/loadtests/staging/pipeline--92bqWHPgyG8RyJSUlaFc_xKTLDn-3GyzNYj6WSAihE.pb
Sep 22, 2021 12:45:38 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as 
step s1
Sep 22, 2021 12:45:38 PM 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: 
[org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2eeb0f9b, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1b1c538d, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1645f294, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6325f352, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@15c4af7a, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6cbd0674, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@55d58825, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@19a64eae, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@29a98d9f, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2da3b078, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@544e8149, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7fb66650, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1a96d94c, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2a869a16, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@ae202c6, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46aa712c, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6ada9c0c, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7412ed6b, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6e7c351d, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b4a0aef]
Sep 22, 2021 12:45:38 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Sep 22, 2021 12:45:38 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Sep 22, 2021 12:45:38 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Sep 22, 2021 12:45:38 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as 
step s5
Sep 22, 2021 12:45:38 PM 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: 
[org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2e5b7fba, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27755487, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4f0cab0a, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fe7b6b0, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7ab4ae59, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77681ce4, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5d96bdf8, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f76c2cc, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@306f6f1d, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7d7cac8, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6fc6deb7, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@367f0121, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7da39774, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@441b8382, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1df1ced0, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5349b246, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@32b0876c, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2aaf152b, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@39bbd9e0, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27fe9713]
Sep 22, 2021 12:45:38 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Sep 22, 2021 12:45:38 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Sep 22, 2021 12:45:38 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Sep 22, 2021 12:45:38 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Sep 22, 2021 12:45:38 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Sep 22, 2021 12:45:38 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Sep 22, 2021 12:45:38 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Sep 22, 2021 12:45:38 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Sep 22, 2021 12:45:38 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Sep 22, 2021 12:45:38 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Sep 22, 2021 12:45:38 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Sep 22, 2021 12:45:38 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
Sep 22, 2021 12:45:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-22_05_45_38-8630686692093343177?project=apache-beam-testing
Sep 22, 2021 12:45:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-09-22_05_45_38-8630686692093343177
Sep 22, 2021 12:45:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel 
> --region=us-central1 2021-09-22_05_45_38-8630686692093343177
Sep 22, 2021 12:45:46 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-22T12:45:46.517Z: The workflow name is not a valid Cloud 
Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring 
will be labeled with this modified job name: 
load0tests0java110dataflow0v20streaming0cogbk01-jenkins-09-i48w. For the best 
monitoring experience, please name your job with a valid Cloud Label. For 
details, see: 
https://cloud.google.com/compute/docs/labeling-resources#restrictions
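
As a hedged follow-up to the warning above: passing a job name that is already a valid Cloud Label (lowercase letters, digits, and hyphens) avoids the renaming Dataflow performs for monitoring labels. The value below is illustrative only and assumes the DataflowPipelineOptions instance from the earlier sketch.

// jobName is a standard PipelineOptions property; this value is a placeholder.
options.setJobName("load-tests-java11-dataflow-v2-streaming-cogbk-1");
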
Sep 22, 2021 12:45:51 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:45:50.707Z: Worker configuration: e2-standard-2 in 
us-central1-a.
Sep 22, 2021 12:45:54 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:45:51.572Z: Expanding SplittableParDo operations into 
optimizable parts.
Sep 22, 2021 12:45:54 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:45:51.608Z: Expanding CollectionToSingleton operations into 
optimizable parts.
Sep 22, 2021 12:45:54 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:45:51.673Z: Expanding CoGroupByKey operations into 
optimizable parts.
Sep 22, 2021 12:45:54 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:45:51.753Z: Expanding SplittableProcessKeyed operations 
into optimizable parts.
Sep 22, 2021 12:45:54 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:45:51.793Z: Expanding GroupByKey operations into streaming 
Read/Write steps
Sep 22, 2021 12:45:54 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:45:51.861Z: Lifting ValueCombiningMappingFns into 
MergeBucketsMappingFns
Sep 22, 2021 12:45:54 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:45:51.956Z: Fusing adjacent ParDo, Read, Write, and Flatten 
operations
Sep 22, 2021 12:45:54 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:45:51.989Z: Unzipping flatten CoGroupByKey-Flatten for 
input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Sep 22, 2021 12:45:54 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:45:52.026Z: Fusing unzipped copy of 
CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into 
producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Sep 22, 2021 12:45:54 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:45:52.061Z: Fusing consumer CoGroupByKey/GBK/WriteStream 
into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Sep 22, 2021 12:45:54 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:45:52.091Z: Fusing consumer Read 
input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read 
input/Impulse
Sep 22, 2021 12:45:54 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:45:52.124Z: Fusing consumer 
Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
 into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Sep 22, 2021 12:45:54 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:45:52.156Z: Fusing consumer 
Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing
 into 
Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Sep 22, 2021 12:45:54 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:45:52.189Z: Fusing consumer Read 
input/ParDo(StripIds)/ParMultiDo(StripIds) into 
Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Sep 22, 2021 12:45:54 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:45:52.226Z: Fusing consumer Collect start time metrics 
(input)/ParMultiDo(TimeMonitor) into Read 
input/ParDo(StripIds)/ParMultiDo(StripIds)
Sep 22, 2021 12:45:54 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:45:52.262Z: Fusing consumer Window.Into()/Window.Assign 
into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Sep 22, 2021 12:45:54 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:45:52.295Z: Fusing consumer 
CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into 
Window.Into()/Window.Assign
Sep 22, 2021 12:45:54 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:45:52.326Z: Fusing consumer Read 
co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read 
co-input/Impulse
Sep 22, 2021 12:45:54 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:45:52.354Z: Fusing consumer 
Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
 into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Sep 22, 2021 12:45:54 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:45:52.392Z: Fusing consumer 
Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing
 into 
Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Sep 22, 2021 12:45:54 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:45:52.429Z: Fusing consumer Read 
co-input/ParDo(StripIds)/ParMultiDo(StripIds) into 
Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Sep 22, 2021 12:45:54 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:45:52.464Z: Fusing consumer Collect start time metrics 
(co-input)/ParMultiDo(TimeMonitor) into Read 
co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Sep 22, 2021 12:45:54 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:45:52.498Z: Fusing consumer Window.Into()2/Window.Assign 
into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Sep 22, 2021 12:45:54 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:45:52.535Z: Fusing consumer 
CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into 
Window.Into()2/Window.Assign
Sep 22, 2021 12:45:54 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:45:52.557Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets 
into CoGroupByKey/GBK/ReadStream
Sep 22, 2021 12:45:54 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:45:52.592Z: Fusing consumer 
CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into 
CoGroupByKey/GBK/MergeBuckets
Sep 22, 2021 12:45:54 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:45:52.623Z: Fusing consumer Ungroup and 
reiterate/ParMultiDo(UngroupAndReiterate) into 
CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Sep 22, 2021 12:45:54 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:45:52.646Z: Fusing consumer Collect total 
bytes/ParMultiDo(ByteMonitor) into Ungroup and 
reiterate/ParMultiDo(UngroupAndReiterate)
Sep 22, 2021 12:45:54 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:45:52.677Z: Fusing consumer Collect end time 
metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Sep 22, 2021 12:45:54 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:45:53.039Z: Starting 5 workers in us-central1-a...
Sep 22, 2021 12:46:05 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:46:03.523Z: Your project already contains 100 
Dataflow-created metric descriptors, so new user metrics of the form 
custom.googleapis.com/* will not be created. However, all user metrics are also 
available in the metric dataflow.googleapis.com/job/user_counter. If you rely 
on the custom metrics, you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Sep 22, 2021 12:46:24 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:46:22.700Z: Autoscaling: Raised the number of workers to 1 so 
that the pipeline can catch up with its backlog and keep up with its input rate.
Sep 22, 2021 12:46:24 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:46:22.729Z: Resized worker pool to 1, though goal was 5. 
This could be a quota issue.
Sep 22, 2021 12:46:34 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:46:32.989Z: Autoscaling: Raised the number of workers to 5 so 
that the pipeline can catch up with its backlog and keep up with its input rate.
Sep 22, 2021 12:47:30 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:47:29.757Z: Workers have started successfully.
Sep 22, 2021 12:47:30 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T12:47:29.788Z: Workers have started successfully.
Sep 22, 2021 2:09:54 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-22T14:09:53.191Z: Staged package 
amazon-kinesis-producer-0.14.1-5YYT_oyDFV17VWs8Of43O7N08sRfHu0SuJeqVSFCuLg.jar 
at location 
'gs://temp-storage-for-perf-tests/loadtests/staging/amazon-kinesis-producer-0.14.1-5YYT_oyDFV17VWs8Of43O7N08sRfHu0SuJeqVSFCuLg.jar'
 is inaccessible.
Sep 22, 2021 2:09:57 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-22T14:09:57.035Z: One or more access checks for temp location 
or staged files failed. Please refer to other error messages for details. For 
more information on security and permissions, please see 
https://cloud.google.com/dataflow/security-and-permissions.
Sep 22, 2021 4:00:29 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T16:00:27.380Z: Cancel request is committed for workflow job: 
2021-09-22_05_45_38-8630686692093343177.
Sep 22, 2021 4:00:29 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T16:00:27.416Z: Cleaning up.
Sep 22, 2021 4:00:29 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T16:00:27.491Z: Stopping worker pool...
Sep 22, 2021 4:00:29 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T16:00:27.565Z: Stopping worker pool...
Sep 22, 2021 4:02:44 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T16:02:44.163Z: Autoscaling: Reduced the number of workers to 0 
based on low average worker CPU utilization, and the pipeline having sufficiently 
low backlog and keeping up with input rate.
Sep 22, 2021 4:02:44 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-22T16:02:44.201Z: Worker pool stopped.
Sep 22, 2021 4:02:49 PM org.apache.beam.runners.dataflow.DataflowPipelineJob 
logTerminalState
INFO: Job 2021-09-22_05_45_38-8630686692093343177 finished with status 
CANCELLED.
Load test results for test (ID): 7e9be1e1-4f39-4908-9b1d-ea8004aa235b and 
timestamp: 2021-09-22T12:45:33.904000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                 11535.738
dataflow_v2_java11_total_bytes_count             2.92455941E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: 
CANCELLED.
        at 
org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
        at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
        at 
org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
        at 
org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
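
For context on this exception, here is a minimal sketch of the failure handling involved, assumed rather than copied from LoadTest/JobFailure: the streaming job is cancelled once the test's time budget is spent, so the terminal state seen by the test is CANCELLED instead of DONE, and any non-DONE state is surfaced as a RuntimeException. The four-hour limit below is illustrative.

import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.PipelineResult;
import org.joda.time.Duration;

static void runAndCheck(Pipeline pipeline) {
  PipelineResult result = pipeline.run();
  // For a streaming job that is cancelled externally, the terminal state
  // reported here is CANCELLED rather than DONE.
  PipelineResult.State state = result.waitUntilFinish(Duration.standardHours(4));
  if (state != PipelineResult.State.DONE) {
    throw new RuntimeException("Invalid job state: " + state);
  }
}
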

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210922124335
Untagged: 
us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:eff74314ab303743ab8314e947d1999c5f0b633378bc0789128ffce6bf27b20a
Deleted: sha256:13456361d1b2fbc317b58dedd4c54c27ac5f742edfd3eaccc5528e9ac6160e94
Deleted: sha256:b3bf7ec2f7552e27427a868c7b901bed80a7bb312c9a68532f488a9da140f4b9
Deleted: sha256:76d19613a00da22afdd6e5ad3516ce02e0d388dc69fd7795c0a5cdc63bc52f75
Deleted: sha256:70181a744720f29fd5615ec56fce13bcf27a967405a0483ad3df1afc59df016f
Deleted: sha256:cf1acda140f19415c74687bb2abba4eea28fbb9a8c91b1c5854c6bb8f13e8850
Deleted: sha256:687d5a5d634adc80adb4a9701440a2954bf8dd4f4c691e17030fe0119ea79ff6
Deleted: sha256:ee219303f9b8e622d62b7a53aa16cf9bc8b25e2bdbe5ee38d272b32b488de3af
Deleted: sha256:f7787378d4113a51c42376eb1a1b368d2400dbac8537c078e881c520bbb0df79
Deleted: sha256:c5263b87bad64aa125a9ca3cc37fed078f7fa58066dc96056bc87cf5a449739a
Deleted: sha256:7dff09d37c5974b025478c8995f588bb466b8242785edbcd5138236ff37e9b9e
Deleted: sha256:a55477630e507e10750021b0e73ed7ffa3c7fa74cef86c423723c4eb74854aba
Deleted: sha256:1fecfd038bcaf4276e8ede41605ed687b653e06820a6883d224d5f2d12c43fcd
Deleted: sha256:7669ab5ce7d9ace61d245839652dab2eedc4d767598ee2d1e95a8dd76c24f701
Deleted: sha256:5c48581ce374529e720e84ffa0c20919f66f571df54cde3277ce2a377e37c03f
Deleted: sha256:c09a7fa64d597b371f04e4365e8dfe7ebd8d2f17bbf699d9f0ba3eea6c1db9ed
Deleted: sha256:d3c0aa9f2368fbafb5b1110731164576329b49c98fa93466b68b0f8663e3cb92
Deleted: sha256:784cc9603582840701620872deac520f54d9ac58dc863137bbdaf8211c45e988
Deleted: sha256:8d85aa290abd9dd675feec91a8e7b836080b4fd3be512d212156b5616c8b083d
Deleted: sha256:7cb2d62743b08bc8c22a0a5998280efab6cb0e9b4e2b1a9a8940c7d50367e898
Deleted: sha256:d3fcd50dfe22262d70cd51149d5cb9ef7f7dcf5ffdf100d855ba5e4a15415401
Deleted: sha256:6bfd67221536ed73ac3b1948d682ab8287a85662e4b2431b22a6906d4f149409
Deleted: sha256:810a2f68192eee2fbdd5a32f6cb13b6e47dbffaf3a80913bcd23c6ffcf15d620
Deleted: sha256:02e087a3cd8fff999a51746db2e1fd1a4ba9bac8b53e813a9b97b25c598f89a1
Deleted: sha256:2b18d073580d2e89148ce234a7b49f779c134587f8dfe49b4c4ad0a499113fd2
Deleted: sha256:d2f785efc30adaf5d0ebf3cd41162fd88c1b78f5f6884d11e6741536061c6e6b
Deleted: sha256:7df056af9fdca24107e272aec2cbbc8d5b930657364c10b8460dee67c37fb09a
Deleted: sha256:b6c63cd83a11dd4c8aa3449be982e3f3acfe3ad3acfb16a564100063e5ef967c
Deleted: sha256:76a645bf1651ec0c5e745edc178a4fa533b2fdfe4ff2e109c4d37336b898d60d
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210922124335]
- referencing digest: 
[us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:eff74314ab303743ab8314e947d1999c5f0b633378bc0789128ffce6bf27b20a]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210922124335] 
(referencing 
[us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:eff74314ab303743ab8314e947d1999c5f0b633378bc0789128ffce6bf27b20a])].
Removing untagged image 
us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:eff74314ab303743ab8314e947d1999c5f0b633378bc0789128ffce6bf27b20a
Digests:
- 
us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:eff74314ab303743ab8314e947d1999c5f0b633378bc0789128ffce6bf27b20a
Deleted 
[us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:eff74314ab303743ab8314e947d1999c5f0b633378bc0789128ffce6bf27b20a].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with 
> non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 33s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/r4w6jff3bwbha

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
