See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/91/display/redirect?page=changes>
Changes:

[dpcollins] [BEAM-12882] - fix test that is flaky when jenkins is overloaded
[noreply] [BEAM-12885] Enable NeedsRunner Tests for Samza Portable Runner (#15512)
[noreply] [BEAM-12100][BEAM-10379][BEAM-9514][BEAM-12647][BEAM-12099]
[noreply] [BEAM-12543] Fix DataFrrame typo (#15509)
[noreply] [BEAM-12794] Remove obsolete uses of sys.exc_info. (#15507)
[noreply] [BEAM-11666] flake on RecordingManagerTest (#15118)
[kawaigin] [BEAM-10708] Introspect beam_sql output
[noreply] Minor: Restore "Bugfix" section in CHANGES.md (#15516)
[Kyle Weaver] [BEAM-10459] Unignore numeric aggregation tests.
[ajamato] [BEAM-12898] Disable Flink Load tests which are leading Dataproc

------------------------------------------
[...truncated 48.96 KB...]
The push refers to repository [us.gcr.io/apache-beam-testing/java-postcommit-it/java]
619f3fbf92b9: Preparing
af3a23159c7d: Preparing
1270d47666fd: Preparing
6d0fed1f7ad0: Preparing
c508d04f70e6: Preparing
2014e59c76b4: Preparing
fea88fcf18b1: Preparing
57c8a67db75b: Preparing
7fb2e83f9629: Preparing
96ac1d08cb55: Preparing
a42d90664913: Preparing
60b6f299bc46: Preparing
3891808a925b: Preparing
d402f4f1b906: Preparing
00ef5416d927: Preparing
8555e663f65b: Preparing
d00da3cd7763: Preparing
4e61e63529c2: Preparing
7fb2e83f9629: Waiting
799760671c38: Preparing
3891808a925b: Waiting
d00da3cd7763: Waiting
d402f4f1b906: Waiting
4e61e63529c2: Waiting
799760671c38: Waiting
00ef5416d927: Waiting
8555e663f65b: Waiting
2014e59c76b4: Waiting
fea88fcf18b1: Waiting
57c8a67db75b: Waiting
a42d90664913: Waiting
60b6f299bc46: Waiting
c508d04f70e6: Pushed
1270d47666fd: Pushed
af3a23159c7d: Pushed
2014e59c76b4: Pushed
57c8a67db75b: Pushed
619f3fbf92b9: Pushed
6d0fed1f7ad0: Pushed
7fb2e83f9629: Pushed
3891808a925b: Layer already exists
d402f4f1b906: Layer already exists
00ef5416d927: Layer already exists
8555e663f65b: Layer already exists
d00da3cd7763: Layer already exists
4e61e63529c2: Layer already exists
a42d90664913: Pushed
799760671c38: Layer already exists
60b6f299bc46: Pushed
fea88fcf18b1: Pushed
96ac1d08cb55: Pushed
20210916124428: digest: sha256:30ac48ef18e0b01215057c18fd66d15bcc4d927ca67a8575beda181b516f2adb size: 4311

> Task :sdks:java:testing:load-tests:run
Sep 16, 2021 12:48:27 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Sep 16, 2021 12:48:28 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 195 files. Enable logging at DEBUG level to see which files will be staged.
Sep 16, 2021 12:48:30 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Sep 16, 2021 12:48:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Sep 16, 2021 12:48:34 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 195 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Sep 16, 2021 12:48:39 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 195 files cached, 0 files newly uploaded in 4 seconds
Sep 16, 2021 12:48:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Sep 16, 2021 12:48:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <106569 bytes, hash c34dcce505512389ee60a4fed7f649627026e48cb41bc54621bd457366a034d5> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-w03M5QVRI4nuYKT-1_ZJYnAm5Iy0G8VGIb1Fc2agNNU.pb
Sep 16, 2021 12:48:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Sep 16, 2021 12:48:43 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1645f294, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6325f352, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@15c4af7a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6cbd0674, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@55d58825, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@19a64eae, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@29a98d9f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2da3b078, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@544e8149, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7fb66650, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1a96d94c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2a869a16, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@ae202c6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46aa712c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6ada9c0c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7412ed6b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6e7c351d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b4a0aef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@45cec376, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@26d41711]
Sep 16, 2021 12:48:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Sep 16, 2021 12:48:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Sep 16, 2021 12:48:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Sep 16, 2021 12:48:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Sep 16, 2021 12:48:43 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4f0cab0a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fe7b6b0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7ab4ae59, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77681ce4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5d96bdf8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f76c2cc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@306f6f1d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7d7cac8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6fc6deb7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@367f0121, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7da39774, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@441b8382, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1df1ced0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5349b246, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@32b0876c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2aaf152b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@39bbd9e0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27fe9713, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@11c3ff67, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4397a639]
Sep 16, 2021 12:48:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Sep 16, 2021 12:48:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Sep 16, 2021 12:48:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Sep 16, 2021 12:48:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Sep 16, 2021 12:48:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Sep 16, 2021 12:48:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Sep 16, 2021 12:48:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Sep 16, 2021 12:48:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Sep 16, 2021 12:48:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Sep 16, 2021 12:48:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Sep 16, 2021 12:48:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Sep 16, 2021 12:48:44 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
Sep 16, 2021 12:48:48 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-16_05_48_44-18227987266575069105?project=apache-beam-testing
Sep 16, 2021 12:48:48 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-09-16_05_48_44-18227987266575069105
Sep 16, 2021 12:48:48 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-16_05_48_44-18227987266575069105
Sep 16, 2021 12:48:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-16T12:48:54.833Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-09-ioq9. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Sep 16, 2021 12:49:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:48:59.689Z: Worker configuration: e2-standard-2 in us-central1-a.
Sep 16, 2021 12:49:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:00.610Z: Expanding SplittableParDo operations into optimizable parts.
Sep 16, 2021 12:49:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:00.656Z: Expanding CollectionToSingleton operations into optimizable parts.
Sep 16, 2021 12:49:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:00.726Z: Expanding CoGroupByKey operations into optimizable parts.
Sep 16, 2021 12:49:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:00.802Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Sep 16, 2021 12:49:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:00.836Z: Expanding GroupByKey operations into streaming Read/Write steps
Sep 16, 2021 12:49:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:00.916Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Sep 16, 2021 12:49:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:01.036Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Sep 16, 2021 12:49:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:01.073Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Sep 16, 2021 12:49:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:01.109Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Sep 16, 2021 12:49:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:01.143Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Sep 16, 2021 12:49:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:01.197Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Sep 16, 2021 12:49:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:01.243Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Sep 16, 2021 12:49:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:01.281Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Sep 16, 2021 12:49:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:01.325Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Sep 16, 2021 12:49:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:01.355Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Sep 16, 2021 12:49:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:01.395Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Sep 16, 2021 12:49:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:01.437Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Sep 16, 2021 12:49:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:01.478Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Sep 16, 2021 12:49:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:01.531Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Sep 16, 2021 12:49:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:01.598Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Sep 16, 2021 12:49:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:01.663Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Sep 16, 2021 12:49:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:01.707Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Sep 16, 2021 12:49:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:01.770Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Sep 16, 2021 12:49:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:01.818Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Sep 16, 2021 12:49:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:01.882Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Sep 16, 2021 12:49:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:01.929Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Sep 16, 2021 12:49:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:01.993Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Sep 16, 2021 12:49:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:02.045Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Sep 16, 2021 12:49:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:02.106Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Sep 16, 2021 12:49:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:02.757Z: Starting 5 workers in us-central1-a...
Sep 16, 2021 12:49:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:11.875Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Sep 16, 2021 12:49:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:49:47.462Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Sep 16, 2021 12:50:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:50:16.270Z: Workers have started successfully.
Sep 16, 2021 12:50:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T12:50:16.309Z: Workers have started successfully.
Sep 16, 2021 4:00:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T16:00:29.931Z: Cancel request is committed for workflow job: 2021-09-16_05_48_44-18227987266575069105.
Sep 16, 2021 4:00:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T16:00:30.102Z: Cleaning up.
Sep 16, 2021 4:00:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T16:00:30.175Z: Stopping worker pool...
Sep 16, 2021 4:00:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T16:00:30.252Z: Stopping worker pool...
Sep 16, 2021 4:02:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T16:02:49.981Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Sep 16, 2021 4:02:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-16T16:02:50.023Z: Worker pool stopped.
Sep 16, 2021 4:02:57 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-09-16_05_48_44-18227987266575069105 finished with status CANCELLED.
Load test results for test (ID): 6af5c062-d71c-461d-816b-11e5ea70a3e0 and timestamp: 2021-09-16T12:48:29.490000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec               11346.163
dataflow_v2_java11_total_bytes_count     2.18755428E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210916124428
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:30ac48ef18e0b01215057c18fd66d15bcc4d927ca67a8575beda181b516f2adb
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210916124428] - referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:30ac48ef18e0b01215057c18fd66d15bcc4d927ca67a8575beda181b516f2adb]
Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210916124428] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:30ac48ef18e0b01215057c18fd66d15bcc4d927ca67a8575beda181b516f2adb])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:30ac48ef18e0b01215057c18fd66d15bcc4d927ca67a8575beda181b516f2adb
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:30ac48ef18e0b01215057c18fd66d15bcc4d927ca67a8575beda181b516f2adb
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:30ac48ef18e0b01215057c18fd66d15bcc4d927ca67a8575beda181b516f2adb].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 20s

101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/yfimtcaj4hgw4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
