See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/233/display/redirect?page=changes>
Changes:

[mmack] [BEAM-13663] Remove unused duplicate option for AWS client configuration

[mmack] [BEAM-13203] Deprecate SnsIO.writeAsync for AWS Sdk v2 due to risk of

[noreply] [BEAM-13828] Fix stale bot (#16734)

[noreply] Merge pull request #16364 from [BEAM-13182] Add diagrams to backend

[noreply] [BEAM-13811] Fix save_main_session arg in tests examples (#16709)

[Kiley Sok] Update beam-master version

[noreply] [BEAM-13015] Calculate exception for closing BeamFnDataInboundObserver2

[noreply] Minor doc tweaks for validating vendoring. (#16747)

[noreply] [BEAM-13686] OOM while logging a large pipeline even when logging level

[noreply] [BEAM-13629] Update URL artifact type for Dataflow Go (#16490)

[noreply] [BEAM-13832] Add automated expansion service start-up to JDBCio (#16739)

[noreply] [BEAM-13831] Add automated expansion service infra into Debezium Read()

[noreply] [BEAM-13821] Add automated expansion service start-up to KafkaIO

[noreply] [BEAM-13799] Created a Dataproc cluster manager for Interactive Beam

[noreply] Merge pull request #16727: [BEAM-11971] remove unsafe Concurrent data

------------------------------------------
[...truncated 49.78 KB...]
bd18a1a09476: Preparing
6f78efdc0a6b: Preparing
352660b137e6: Preparing
4590f8c89770: Preparing
ca33502d2cac: Preparing
0aa3674558b5: Preparing
7c072cee6a29: Preparing
1e5fdc3d671c: Preparing
613ab28cf833: Preparing
bed676ceab7a: Preparing
6398d5cccd2c: Preparing
84954f958ede: Waiting
0aa3674558b5: Waiting
2a5e893f830d: Waiting
bd18a1a09476: Waiting
352660b137e6: Waiting
7c072cee6a29: Waiting
ca33502d2cac: Waiting
4590f8c89770: Waiting
6f78efdc0a6b: Waiting
1e5fdc3d671c: Waiting
0b0f2f2f5279: Preparing
69984deac861: Waiting
6398d5cccd2c: Waiting
0b0f2f2f5279: Waiting
c0c80369275f: Pushed
ea7e0dd8d579: Pushed
eafcc0a37937: Pushed
0bc31fcf22e4: Pushed
84954f958ede: Pushed
73c3bea82b73: Pushed
2a5e893f830d: Pushed
6f78efdc0a6b: Pushed
bd18a1a09476: Pushed
0aa3674558b5: Layer already exists
7c072cee6a29: Layer already exists
4590f8c89770: Pushed
1e5fdc3d671c: Layer already exists
613ab28cf833: Layer already exists
bed676ceab7a: Layer already exists
69984deac861: Pushed
6398d5cccd2c: Layer already exists
0b0f2f2f5279: Layer already exists
ca33502d2cac: Pushed
352660b137e6: Pushed
20220205125145: digest: sha256:055c05273d506a6e8284e92fc3428bb39a7bb498d9237f4ba1aa6b63ac7d017b size: 4520

> Task :sdks:java:testing:load-tests:run
Feb 05, 2022 12:54:32 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Feb 05, 2022 12:54:33 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 204 files. Enable logging at DEBUG level to see which files will be staged.
Feb 05, 2022 12:54:34 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Feb 05, 2022 12:54:34 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Feb 05, 2022 12:54:37 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 204 files from PipelineOptions.filesToStage to staging location to prepare for execution.
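A note on the "transforms do not have stable unique names" warning above: it is logged when the same transform is applied more than once without an explicit step name; this pipeline applies Window.Into() to both the input and the co-input, so the second application gets the auto-generated name Window.Into()2 (visible in the step list below). A minimal sketch of naming each application so the warning goes away; the helper, step names, and window size here are illustrative, not the load test's actual configuration:

    import org.apache.beam.sdk.transforms.windowing.FixedWindows;
    import org.apache.beam.sdk.transforms.windowing.Window;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;
    import org.joda.time.Duration;

    public class NamedWindowing {
      // Applying Window.into(...) twice with no explicit name produces the
      // auto-generated step names "Window.Into()" and "Window.Into()2", which is
      // what triggers the "stable unique names" validation warning. Passing a
      // distinct name to apply(...) keeps every step name stable and unique.
      static PCollection<KV<byte[], byte[]>> window(
          PCollection<KV<byte[], byte[]>> records, String stepName, Duration size) {
        return records.apply(stepName, Window.<KV<byte[], byte[]>>into(FixedWindows.of(size)));
      }
      // e.g. window(input, "Window input", Duration.standardSeconds(60)) and
      //      window(coInput, "Window co-input", Duration.standardSeconds(60))
    }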
Feb 05, 2022 12:54:39 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 204 files cached, 0 files newly uploaded in 1 seconds
Feb 05, 2022 12:54:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Feb 05, 2022 12:54:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <114077 bytes, hash da1da67ee3a7a67940df1af68529f9f579d6853ec42a1c3e292b590edd29193b> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-2h2mfuOnpnlA3xr2hSn59XnWhT7EKhw-KStZDt0pGTs.pb
Feb 05, 2022 12:54:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Feb 05, 2022 12:54:41 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@574a89e2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1e1e9ef3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3dd31157, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@31c628e7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3240b2a4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@58434b19, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7d3fb0ef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7dbe2ebf, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4adc663e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@885e7ff, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@8bd86c8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4fa9ab6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2d3ef181, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@a2341c6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6e4c0d8c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3e3315d9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@64db4967, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@74e6094b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7a485a36, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5cf3157b]
Feb 05, 2022 12:54:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Feb 05, 2022 12:54:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Feb 05, 2022 12:54:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Feb 05, 2022 12:54:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Feb 05, 2022 12:54:41 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4c5228e7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@38e7ed69, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@806996, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@78b612c6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@257e0827, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@22752544, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@21ba2445, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@69d23296, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3c3820bb, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@376c7d7d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4784efd9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3fba233d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@427ae189, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@16a9eb2e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@76332405, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@187e5235, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@d1d8e1a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5434e40c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3b48e183, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@514de325]
Feb 05, 2022 12:54:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Feb 05, 2022 12:54:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Feb 05, 2022 12:54:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Feb 05, 2022 12:54:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Feb 05, 2022 12:54:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Feb 05, 2022 12:54:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Feb 05, 2022 12:54:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Feb 05, 2022 12:54:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Feb 05, 2022 12:54:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Feb 05, 2022 12:54:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Feb 05, 2022 12:54:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Feb 05, 2022 12:54:41 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.37.0-SNAPSHOT
Feb 05, 2022 12:54:42 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-05_04_54_41-13484680452619416034?project=apache-beam-testing
Feb 05, 2022 12:54:42 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-02-05_04_54_41-13484680452619416034
Feb 05, 2022 12:54:42 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-02-05_04_54_41-13484680452619416034
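Besides the gcloud command printed above, the submitted job can also be cancelled programmatically through the PipelineResult handle returned by Pipeline.run(), which is roughly how a harness can bound a streaming run. A minimal sketch under that assumption; the three-hour timeout and class name are illustrative, not this test's actual configuration:

    import java.io.IOException;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.PipelineResult;
    import org.joda.time.Duration;

    public class BoundedStreamingRun {
      // Run the pipeline, wait up to a fixed duration, and cancel the job if it
      // has not reached a terminal state by then.
      static PipelineResult.State runWithTimeout(Pipeline pipeline) throws IOException {
        PipelineResult result = pipeline.run();
        PipelineResult.State state = result.waitUntilFinish(Duration.standardHours(3));
        if (state == null || !state.isTerminal()) {
          result.cancel();              // ask the service to cancel the job
          state = result.getState();
        }
        return state;
      }
    }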
Feb 05, 2022 12:54:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-02-05T12:54:49.223Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-02-9qkg. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Feb 05, 2022 12:54:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:54:53.857Z: Worker configuration: e2-standard-2 in us-central1-b.
Feb 05, 2022 12:54:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:54:54.395Z: Expanding SplittableParDo operations into optimizable parts.
Feb 05, 2022 12:54:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:54:54.442Z: Expanding CollectionToSingleton operations into optimizable parts.
Feb 05, 2022 12:54:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:54:54.520Z: Expanding CoGroupByKey operations into optimizable parts.
Feb 05, 2022 12:54:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:54:54.636Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Feb 05, 2022 12:54:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:54:54.713Z: Expanding GroupByKey operations into streaming Read/Write steps
Feb 05, 2022 12:54:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:54:54.759Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Feb 05, 2022 12:54:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:54:54.869Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Feb 05, 2022 12:54:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:54:54.908Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Feb 05, 2022 12:54:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:54:54.973Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Feb 05, 2022 12:54:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:54:55.004Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Feb 05, 2022 12:54:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:54:55.037Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Feb 05, 2022 12:54:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:54:55.079Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Feb 05, 2022 12:54:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:54:55.120Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Feb 05, 2022 12:54:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:54:55.166Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Feb 05, 2022 12:54:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:54:55.194Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Feb 05, 2022 12:54:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:54:55.228Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Feb 05, 2022 12:54:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:54:55.249Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Feb 05, 2022 12:54:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:54:55.276Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Feb 05, 2022 12:54:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:54:55.303Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Feb 05, 2022 12:54:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:54:55.336Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Feb 05, 2022 12:54:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:54:55.367Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Feb 05, 2022 12:54:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:54:55.402Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Feb 05, 2022 12:54:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:54:55.437Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Feb 05, 2022 12:54:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:54:55.474Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Feb 05, 2022 12:54:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:54:55.504Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Feb 05, 2022 12:54:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:54:55.539Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Feb 05, 2022 12:54:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:54:55.573Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Feb 05, 2022 12:54:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:54:55.608Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Feb 05, 2022 12:54:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:54:55.643Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Feb 05, 2022 12:54:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:54:56.146Z: Starting 5 workers in us-central1-b...
Feb 05, 2022 12:55:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:55:06.874Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Feb 05, 2022 12:55:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:55:40.608Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Feb 05, 2022 12:56:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:56:41.210Z: Workers have started successfully.
Feb 05, 2022 12:56:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T12:56:41.244Z: Workers have started successfully.
Feb 05, 2022 4:00:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T16:00:45.291Z: Cancel request is committed for workflow job: 2022-02-05_04_54_41-13484680452619416034.
Feb 05, 2022 4:00:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T16:00:45.365Z: Cleaning up.
Feb 05, 2022 4:00:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T16:00:45.423Z: Stopping worker pool...
Feb 05, 2022 4:00:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T16:00:45.465Z: Stopping worker pool...
Feb 05, 2022 4:03:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T16:03:14.878Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
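On the "workflow name is not a valid Cloud Label" warning logged at submission time above: the Jenkins-generated job name had to be rewritten before it could be applied as a label to Compute Engine resources. A minimal sketch, assuming it is acceptable to set the job name explicitly via PipelineOptions; the option handling and the name itself are illustrative:

    import org.apache.beam.sdk.options.PipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class LabelSafeJobName {
      public static void main(String[] args) {
        PipelineOptions options = PipelineOptionsFactory.fromArgs(args).create();
        // A name made only of lowercase letters, digits, and hyphens is a valid
        // Cloud Label value, so the service does not need to rewrite it.
        options.setJobName("load-tests-java11-dataflow-v2-streaming-cogbk-1");
      }
    }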
Feb 05, 2022 4:03:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-05T16:03:15.058Z: Worker pool stopped.
Feb 05, 2022 4:03:23 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-02-05_04_54_41-13484680452619416034 finished with status CANCELLED.
Load test results for test (ID): 6a3f8a7b-51ed-4375-acb8-a03a87edb427 and timestamp: 2022-02-05T12:54:33.571000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec               11018.873
dataflow_v2_java11_total_bytes_count     1.31740493E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
    at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
    at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
    at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
    at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220205125145
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:055c05273d506a6e8284e92fc3428bb39a7bb498d9237f4ba1aa6b63ac7d017b
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220205125145] - referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:055c05273d506a6e8284e92fc3428bb39a7bb498d9237f4ba1aa6b63ac7d017b]
Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220205125145] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:055c05273d506a6e8284e92fc3428bb39a7bb498d9237f4ba1aa6b63ac7d017b])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:055c05273d506a6e8284e92fc3428bb39a7bb498d9237f4ba1aa6b63ac7d017b
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:055c05273d506a6e8284e92fc3428bb39a7bb498d9237f4ba1aa6b63ac7d017b
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:055c05273d506a6e8284e92fc3428bb39a7bb498d9237f4ba1aa6b63ac7d017b].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 12m 5s

109 actionable tasks: 72 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/hegfl4wykygb2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
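The RuntimeException reported above is the load-test harness treating CANCELLED as an unexpected terminal state for this run. A minimal sketch of that kind of terminal-state check, not the actual JobFailure implementation; which states count as success is an assumption here:

    import org.apache.beam.sdk.PipelineResult;

    public class TerminalStateCheck {
      // Fail the run unless the job ended in the expected terminal state. Whether
      // CANCELLED should count as success depends on how the run was stopped; here
      // anything other than DONE fails, matching the "Invalid job state: CANCELLED"
      // error above.
      static void requireDone(PipelineResult result) {
        PipelineResult.State state = result.getState();
        if (state != PipelineResult.State.DONE) {
          throw new RuntimeException("Invalid job state: " + state);
        }
      }
    }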
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]