See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/180/display/redirect?page=changes>
Changes:

[Daniel Oliveira] [BEAM-13321] Pass TempLocation as pipeline option to Dataflow Go for

[Robert Bradshaw] Better type hints for Count combiners.

[Kyle Weaver] Include name of missing tag in error message.

[stranniknm] [BEAM-13423]: fix frontend failure if no examples

[daria.malkova] change return type of 2 methods

[mmack] [BEAM-13441] Use quiet delete for S3 batch deletes. In quiet mode only

[noreply] Updating Grafana from v8.1.2 to v8.1.6

[daria.malkova] Docs for validators tests

[daria.malkova] change context type

[noreply] Merge pull request #16140 from [BEAM-13377][Playground] Update CI/CD

[noreply] Merge pull request #16120 from [BEAM-13333][Playground] Save Python logs

[noreply] Merge pull request #16185 from [BEAM-13425][Playground][Bugfix] Support

[mmack] [BEAM-13445] Correctly set data limit when flushing S3 upload buffer and

[noreply] Merge pull request #16121 from [BEAM-13334][Playground] Save Go logs to

[noreply] Merge pull request #16179 from [BEAM-13344][Playground] support python

[noreply] Merge pull request #16208 from [BEAM-13442][Playground] Filepath to log

[noreply] [BEAM-13276] bump jackson-core to 2.13.0 for .test-infra (#16062)

[noreply] Change Pub/Sub Lite PollResult to set explicit watermark (#16216)

[noreply] [BEAM-13454] Fix and test dataframe read_fwf. (#16064)

[noreply] [BEAM-12976] Pipeline visitor to discover pushdown opportunities.

[noreply] [BEAM-13015] Allow decoding a set of elements until we hit the block

------------------------------------------
[...truncated 48.58 KB...]
91f7336bbfff: Waiting
78c3a7b74ad8: Waiting
1a7bf77856fc: Waiting
5626069a74e0: Waiting
e2e8c39e0f77: Waiting
db60aa4405a5: Pushed
3ac9c2fecc70: Pushed
cb89b8924d43: Pushed
752b3aca70b1: Pushed
afa057d8b3b8: Pushed
19bab224b506: Pushed
28c351f2219b: Pushed
78c3a7b74ad8: Pushed
5c81f9330d99: Layer already exists
927f9fcef4cf: Layer already exists
a81f1846a0d2: Layer already exists
3b441d7cb46b: Layer already exists
d3710de04cb3: Layer already exists
aeef33a16417: Pushed
0028bf7c6381: Pushed
91f7336bbfff: Layer already exists
e2e8c39e0f77: Layer already exists
1a7bf77856fc: Pushed
5626069a74e0: Pushed
20211214124339: digest: sha256:45bb2db85b92cc45e097bd29bc34b54a987268c83de0688d5487a1df2fb46615 size: 4311

> Task :sdks:java:testing:load-tests:run
Dec 14, 2021 12:45:58 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Dec 14, 2021 12:45:59 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 203 files. Enable logging at DEBUG level to see which files will be staged.
Dec 14, 2021 12:45:59 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Dec 14, 2021 12:45:59 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Dec 14, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 203 files from PipelineOptions.filesToStage to staging location to prepare for execution.
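The stagingLocation fallback and the "stable unique names" warning above are both controlled from pipeline setup code. A minimal sketch, assuming a standalone Beam Java pipeline rather than the actual load-test harness (the bucket path, class name, and transform names below are illustrative):

    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.windowing.FixedWindows;
    import org.apache.beam.sdk.transforms.windowing.Window;
    import org.apache.beam.sdk.values.KV;
    import org.joda.time.Duration;

    public class StableNamesSketch {
      public static void main(String[] args) {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).withValidation().as(DataflowPipelineOptions.class);
        // Setting stagingLocation explicitly avoids the fallback to gcpTempLocation
        // logged by StagingLocationFactory above (bucket name is illustrative).
        options.setStagingLocation("gs://my-bucket/staging");

        Pipeline p = Pipeline.create(options);
        // Passing an explicit name to apply() gives the windowing transform a stable
        // unique name, which avoids the Pipeline.validate() warning about Window.Into().
        p.apply("CreateSyntheticInput", Create.of(KV.of("key", 1L)))
            .apply("WindowInput",
                Window.<KV<String, Long>>into(FixedWindows.of(Duration.standardSeconds(10))));
        p.run().waitUntilFinish();
      }
    }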
Dec 14, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 203 files cached, 0 files newly uploaded in 0 seconds
Dec 14, 2021 12:46:02 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Dec 14, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <113691 bytes, hash cfa809eda01b4a61e8be5729d895f703dd9635230eb18f9a8af2c0a8d40a307b> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-z6gJ7aAbSmHovlcp2JX3A92WNSMOsY-aivLAqNQKMHs.pb
Dec 14, 2021 12:46:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Dec 14, 2021 12:46:04 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@429f7919, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4a2929a4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@cda6019, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@797c3c3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4012d5bc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4375b013, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1cf0cacc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4f5b08d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@529c2a9a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3c87fdf2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@26bbe604, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fe34b86, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3c98781a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3f736a16, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4601203a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@53abfc07, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2c8c16c0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@80bfa9d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@47c40b56, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4b039c6d]
Dec 14, 2021 12:46:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Dec 14, 2021 12:46:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Dec 14, 2021 12:46:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Dec 14, 2021 12:46:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Dec 14, 2021 12:46:04 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65e0b505, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@67de7a99, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@795f5d51, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@34aeacd1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@54067fdc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4098dd77, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43aeb5e0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2274160, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65383667, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@63cd2cd2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@557a84fe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6deee370, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@49c17ba4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43e065f2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@423c5404, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5853ca50, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1a0d96a5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5a02bfe3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7a3e5cd3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3c79088e]
Dec 14, 2021 12:46:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Dec 14, 2021 12:46:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Dec 14, 2021 12:46:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Dec 14, 2021 12:46:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Dec 14, 2021 12:46:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Dec 14, 2021 12:46:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Dec 14, 2021 12:46:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Dec 14, 2021 12:46:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Dec 14, 2021 12:46:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Dec 14, 2021 12:46:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Dec 14, 2021 12:46:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Dec 14, 2021 12:46:04 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.36.0-SNAPSHOT
Dec 14, 2021 12:46:05 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-12-14_04_46_04-10847864043267183166?project=apache-beam-testing
Dec 14, 2021 12:46:05 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-12-14_04_46_04-10847864043267183166
Dec 14, 2021 12:46:05 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-12-14_04_46_04-10847864043267183166
Dec 14, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-14T12:46:13.628Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-12-rstx. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Dec 14, 2021 12:46:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:23.055Z: Worker configuration: e2-standard-2 in us-central1-a.
Dec 14, 2021 12:46:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:24.009Z: Expanding SplittableParDo operations into optimizable parts.
Dec 14, 2021 12:46:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:24.028Z: Expanding CollectionToSingleton operations into optimizable parts.
Dec 14, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:24.101Z: Expanding CoGroupByKey operations into optimizable parts.
Dec 14, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:24.178Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Dec 14, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:24.210Z: Expanding GroupByKey operations into streaming Read/Write steps
Dec 14, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:24.285Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Dec 14, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:24.396Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Dec 14, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:24.459Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Dec 14, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:24.498Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Dec 14, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:24.537Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Dec 14, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:24.598Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Dec 14, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:24.622Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 14, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:24.670Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Dec 14, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:24.716Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 14, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:24.737Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Dec 14, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:24.765Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Dec 14, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:24.815Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Dec 14, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:24.845Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Dec 14, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:24.886Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 14, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:24.916Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Dec 14, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:24.976Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 14, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:25.014Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Dec 14, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:25.063Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Dec 14, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:25.095Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Dec 14, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:25.127Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Dec 14, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:25.158Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Dec 14, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:25.190Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Dec 14, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:25.221Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Dec 14, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:25.266Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Dec 14, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:25.716Z: Starting 5 ****s in us-central1-a...
Dec 14, 2021 12:46:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:55.690Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Dec 14, 2021 12:47:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:47:17.195Z: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Dec 14, 2021 12:48:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:48:12.483Z: Workers have started successfully.
Dec 14, 2021 12:48:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:48:12.530Z: Workers have started successfully.
Dec 14, 2021 4:00:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T16:00:32.573Z: Cancel request is committed for workflow job: 2021-12-14_04_46_04-10847864043267183166.
Dec 14, 2021 4:00:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T16:00:32.675Z: Cleaning up.
Dec 14, 2021 4:00:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T16:00:32.805Z: Stopping **** pool...
Dec 14, 2021 4:00:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T16:00:32.878Z: Stopping **** pool...
Dec 14, 2021 4:02:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T16:02:58.021Z: Autoscaling: Reduced the number of ****s to 0 based on low average **** CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
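The cancel request above drives the job to the CANCELLED terminal state reported below, which the load-test harness treats as a failed run. A minimal sketch of how a caller can observe the terminal state with the Beam Java SDK, assuming an already constructed Pipeline; this is illustrative, not the actual JobFailure/LoadTest code:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.PipelineResult;

    /** Illustrative helper: fail the run on any terminal state other than DONE. */
    public class TerminalStateCheck {
      public static void runAndCheck(Pipeline pipeline) {
        PipelineResult result = pipeline.run();
        // waitUntilFinish() blocks until the job reaches a terminal state
        // (DONE, FAILED, CANCELLED, or UPDATED on Dataflow).
        PipelineResult.State state = result.waitUntilFinish();
        if (state != PipelineResult.State.DONE) {
          // Mirrors the "Invalid job state: CANCELLED" failure reported below.
          throw new RuntimeException("Invalid job state: " + state);
        }
      }
    }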
Dec 14, 2021 4:02:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T16:02:58.066Z: Worker pool stopped.
Dec 14, 2021 4:03:04 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-12-14_04_46_04-10847864043267183166 finished with status CANCELLED.
Load test results for test (ID): 13c2cfc1-d692-4f89-91f5-e3d08ce48dfa and timestamp: 2021-12-14T12:45:59.353000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                  11458.167
dataflow_v2_java11_total_bytes_count           9.8452935E9
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211214124339
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:45bb2db85b92cc45e097bd29bc34b54a987268c83de0688d5487a1df2fb46615
Deleted: sha256:31984e3cd5f1effd1bd83c233ba0c78e20abcf7cd4e861d837ab4d3d7a601cca
Deleted: sha256:7fd662f7010957ea9bf06bd7b335722f5425f5945f1b6cc6feab97ac52e8e7fd
Deleted: sha256:4d4d41f751686de528528116d227644035570c1cd97b37e2a8fc057f3255a238
Deleted: sha256:b1cd886d55c654df34127fe34baea43da56864e78bd8334e683ae77875d579ee
Deleted: sha256:6cac39d37c9cd6d832a471a67b21f82cea908b29ed529d18a28e98fb71187c38
Deleted: sha256:abc187c15a666a531d7b847e877975e16c07725ef8767bfe06da04752e67a56e
Deleted: sha256:32ff2e0b2822bc7a8a2071756d89f88ade4ce577aed6bf7a7ce31d90ba210748
Deleted: sha256:27bdec2438eaa53e37f2b4a553f9707b0d3bdecfae90f21a68cac6c523940144
Deleted: sha256:7e8e36f085f7cf3e8956b3bfc793a2b02a56c2ac02dc59f085176da8671725a3
Deleted: sha256:93a07ba63b9743f9a6cada9153e29b0e758df3bcff1fb2a90841c118da9edd44
Deleted: sha256:f50a1ee842578cc7b5f3d913bee9a4fbb4d86328d385d08089531af1d3d4f5b8
Deleted: sha256:e199d0e1d081f216364c30f580245c8cdb9db6e689151bc793cb2cda03e9f397
Deleted: sha256:42892fa12be941b704b86cb93cc9e2139322b7c45f4089b886ca89ea055ee6a6
Deleted: sha256:ff7bf5914e1b701f46657e3bcf9b6f6a6fdaff2c75b540bc12ff3f054321747a
Deleted: sha256:46d1b1c2bff7d6ade58020d09acacc3bc18d3b02622d0ea58722c8c996ecf650
Deleted: sha256:f814d152d91795c598d29fb102db03bbf5151b48a024179e2aae3f59650c44a8
Deleted: sha256:6661f2432f0a2d022d406c9a506f7cbc19fa30a84d0a692fffe816142af49cb8
Deleted: sha256:e261346411209ddcad0986f86b18732b00d752be3e07f8712c5cc9a83e9d68c0
Deleted: sha256:c3b179df6819bb1c89c64d2323066dda947d731827c0fc910b1771fd3babf718
Deleted: sha256:97bce8c173e989faa9c6b4a92db55d37e3c7300324a2c3787328833058e8310e
Deleted: sha256:50f45e4b8146d82054d62f41c00de62a55d34918ecbc435fe6771daacb327c8d
Deleted: sha256:b72af5263a1234b9da845a5ed2eec5385fb5af6bafda0ef97ed62716b6a1daf8
Deleted: sha256:6fff26c6d7afe5201f873571c0985c804e842f6d38c75668705dc18afba2060a
Deleted: sha256:30bfe25a08623fd67fa95e4486c93c14b2f2e42a6c52f2dfde8f711953017b17
Deleted: sha256:e8f9be52dfdcf23f1ad37027bf2ecbcbf928be8e2ac3a11b34e1f37e26911cdb
Deleted: sha256:8712d68b22b4e09124a8acea3fe418cd751713e1c17343ea7d26fa783c834a28
Deleted: sha256:5cb8d21e7d1bb69f836eed28a20d53e4aa42076bdd1484ee9fc65e4e1ff5cc22
Deleted: sha256:033cdcc5ad9788237d1333d911b1bbda210747136cbf0e29a4500410f955db70
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211214124339]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:45bb2db85b92cc45e097bd29bc34b54a987268c83de0688d5487a1df2fb46615]
Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211214124339] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:45bb2db85b92cc45e097bd29bc34b54a987268c83de0688d5487a1df2fb46615])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:45bb2db85b92cc45e097bd29bc34b54a987268c83de0688d5487a1df2fb46615
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:45bb2db85b92cc45e097bd29bc34b54a987268c83de0688d5487a1df2fb46615
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:45bb2db85b92cc45e097bd29bc34b54a987268c83de0688d5487a1df2fb46615].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 45s

101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/dj6jyepho2omm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
