See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/76/display/redirect?page=changes>
Changes:

[emilyye] sync nltk, orjson for Python image
[noreply] Allow `google-auth < 3`
[samuelw] [BEAM-12776] Change closing to happen in background in parallel for
[Luke Cwik] [BEAM-12802] Refactor DataStreamsDecoder so that it becomes aware of the
[noreply] Fix typo in BigQuery documentation
[ajamato] [BEAM-11994] Instantiate a new ServiceCallMetric before each request to
[Ankur Goenka] Remove duplicate 2.33.0 section
[Steve Niemitz] [BEAM-12767] Improve PipelineOption parsing UX
[noreply] add python spark example in documentation (#15426)
[noreply] Add per-batch metrics to JdbcIO.write (#15429)

------------------------------------------
[...truncated 50.92 KB...]
05103deb4558: Layer already exists
a881cfa23a78: Layer already exists
46553dbf9a6a: Pushed
825b5c390d9a: Pushed
f96bb7a1c058: Pushed
20210901124331: digest: sha256:e9de3c0384410542c5a5defa5e66c022340c088ca41ffa8c8eb23306f158beba size: 4311

> Task :sdks:java:testing:load-tests:run
Sep 01, 2021 12:45:21 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Sep 01, 2021 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 191 files. Enable logging at DEBUG level to see which files will be staged.
Sep 01, 2021 12:45:22 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Sep 01, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Sep 01, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Sep 01, 2021 12:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <112622 bytes, hash 942ff591edb74b2d891525494d5c7cd9d7ebe325e7da6d5146bd451cd4dc9eb5> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-lC_1ke23Sy2JFSVJTVx82dfr4yXn2m1RRr1FHNTcnrU.pb
Sep 01, 2021 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 191 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Sep 01, 2021 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 191 files cached, 0 files newly uploaded in 0 seconds
Sep 01, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Sep 01, 2021 12:45:27 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1f346ad2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46a145ba, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7ab34619, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@ae2db25, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@363c4251, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7afc4db9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1a1f5f71, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@63884e4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@524270b8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4acb7ecc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2a4f5433, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6812fa3a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@29149030, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@38b8b6c0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@68868328, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@173a6728, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1a22e0ef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@67514bdd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3b920bdc, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7f5538a1]
Sep 01, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Sep 01, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Sep 01, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Sep 01, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Sep 01, 2021 12:45:27 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@a4b5ce3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f5b6e78, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4b4eced1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@71926a36, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@216e9ca3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@75120e58, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@48976e6d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2a367e93, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7f6874f2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1a6dc589, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@697a34af, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@70211df5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4c5228e7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@38e7ed69, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@806996, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@78b612c6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@257e0827, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@22752544, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@21ba2445, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@69d23296]
Sep 01, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Sep 01, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Sep 01, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Sep 01, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Sep 01, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Sep 01, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Sep 01, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Sep 01, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Sep 01, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Sep 01, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Sep 01, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Sep 01, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
Sep 01, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-01_05_45_27-7406839887615416035?project=apache-beam-testing
Sep 01, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-09-01_05_45_27-7406839887615416035
Sep 01, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-01_05_45_27-7406839887615416035
Sep 01, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-01T12:45:35.772Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-09-3eql. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Sep 01, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-01T12:45:39.524Z: Worker configuration: e2-standard-2 in us-central1-a.
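The CoGroupByKey stages translated above (steps s9-s13: MakeUnionTable0/1, Flatten, GBK, ConstructCoGbkResultFn) tag each keyed input into a union table, group by key, and hand each key its per-input value lists. A minimal plain-Java analogue of those semantics — a hypothetical illustration only, not Beam's CoGbkResult implementation — might look like:

```java
import java.util.*;

public class CoGbkSketch {
    // Plain-Java analogue of CoGroupByKey over two keyed inputs ("input" and
    // "co-input" in the log): for every key present in either input, the result
    // holds two lists - the values from input, then the values from co-input.
    public static Map<String, List<List<Integer>>> coGroupByKey(
            Map<String, List<Integer>> input, Map<String, List<Integer>> coInput) {
        // Union of keys (roughly what MakeUnionTable + Flatten + GBK achieve).
        Set<String> keys = new TreeSet<>(input.keySet());
        keys.addAll(coInput.keySet());
        Map<String, List<List<Integer>>> result = new TreeMap<>();
        for (String k : keys) {
            // Per-key, per-input value lists (roughly ConstructCoGbkResultFn).
            result.put(k, Arrays.asList(
                    input.getOrDefault(k, Collections.emptyList()),
                    coInput.getOrDefault(k, Collections.emptyList())));
        }
        return result;
    }
}
```

The load test then iterates these grouped values ("Ungroup and reiterate", step s14) and counts bytes and timestamps (steps s15-s16).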
Sep 01, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-01T12:45:40.160Z: Expanding SplittableParDo operations into optimizable parts.
Sep 01, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-01T12:45:40.181Z: Expanding CollectionToSingleton operations into optimizable parts.
Sep 01, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-01T12:45:40.248Z: Expanding CoGroupByKey operations into optimizable parts.
Sep 01, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-01T12:45:40.319Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Sep 01, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-01T12:45:40.340Z: Expanding GroupByKey operations into streaming Read/Write steps
Sep 01, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-01T12:45:40.409Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Sep 01, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-01T12:45:40.508Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Sep 01, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-01T12:45:40.538Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Sep 01, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-01T12:45:40.575Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Sep 01, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-01T12:45:40.594Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Sep 01, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-01T12:45:40.616Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Sep 01, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-01T12:45:40.637Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Sep 01, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-01T12:45:40.665Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Sep 01, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-01T12:45:40.700Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Sep 01, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-01T12:45:40.732Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Sep 01, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-01T12:45:40.778Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Sep 01, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-01T12:45:40.805Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Sep 01, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-01T12:45:40.827Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Sep 01, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-01T12:45:40.854Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Sep 01, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-01T12:45:40.888Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Sep 01, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-01T12:45:40.922Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Sep 01, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-01T12:45:40.954Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Sep 01, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-01T12:45:40.990Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Sep 01, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-01T12:45:41.021Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Sep 01, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-01T12:45:41.055Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Sep 01, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-01T12:45:41.085Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Sep 01, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-01T12:45:41.110Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Sep 01, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-01T12:45:41.166Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Sep 01, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-01T12:45:41.187Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Sep 01, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-01T12:45:41.541Z: Starting 5 ****s in us-central1-a...
Sep 01, 2021 12:45:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-01T12:45:59.189Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Sep 01, 2021 12:46:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-01T12:46:26.997Z: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Sep 01, 2021 12:46:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-01T12:46:55.773Z: Workers have started successfully.
Sep 01, 2021 12:46:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-01T12:46:55.805Z: Workers have started successfully.
Sep 01, 2021 12:51:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-01T12:51:41.706Z: generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==> dist_proc/dax/workflow/****/streaming/streaming_rpc_windmill_service_server.cc:1089
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==> dist_proc/dax/workflow/****/streaming/streaming_rpc_windmill_service_server.cc:1089
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==> dist_proc/dax/workflow/****/streaming/streaming_rpc_windmill_service_server.cc:1089
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==> dist_proc/dax/workflow/****/streaming/streaming_rpc_windmill_service_server.cc:1089
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==> dist_proc/dax/workflow/****/streaming/streaming_rpc_windmill_service_server.cc:1089
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==> dist_proc/dax/workflow/****/streaming/streaming_rpc_windmill_service_server.cc:1089
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==> dist_proc/dax/workflow/****/streaming/streaming_rpc_windmill_service_server.cc:1089
Sep 01, 2021 12:52:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-01T12:52:50.729Z: generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==> dist_proc/dax/workflow/****/streaming/streaming_rpc_windmill_service_server.cc:1089
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==> dist_proc/dax/workflow/****/streaming/streaming_rpc_windmill_service_server.cc:1089
Sep 01, 2021 12:52:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-01T12:52:52.787Z: generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==> dist_proc/dax/workflow/****/streaming/streaming_rpc_windmill_service_server.cc:1089
generic::internal: The work item requesting state read is no longer valid on the backend. The work has already completed or will be retried. This is expected during autoscaling events.
passed through:
==> dist_proc/dax/workflow/****/streaming/streaming_rpc_windmill_service_server.cc:1089
Sep 01, 2021 4:00:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-01T16:00:26.277Z: Cancel request is committed for workflow job: 2021-09-01_05_45_27-7406839887615416035.
Sep 01, 2021 4:00:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-01T16:00:26.339Z: Cleaning up.
Sep 01, 2021 4:00:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-01T16:00:26.400Z: Stopping **** pool...
Sep 01, 2021 4:00:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-01T16:00:26.445Z: Stopping **** pool...
Sep 01, 2021 4:02:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-01T16:02:44.075Z: Autoscaling: Reduced the number of ****s to 0 based on low average **** CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Sep 01, 2021 4:02:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-01T16:02:44.106Z: Worker pool stopped.
Sep 01, 2021 4:02:50 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-09-01_05_45_27-7406839887615416035 finished with status CANCELLED.

Load test results for test (ID): 619ef2be-01db-4759-837f-40d3f5c2590b and timestamp: 2021-09-01T12:45:22.205000000Z:

                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                  11549.051
dataflow_v2_java11_total_bytes_count        2.49502246E10

Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210901124331
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:e9de3c0384410542c5a5defa5e66c022340c088ca41ffa8c8eb23306f158beba
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210901124331] - referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:e9de3c0384410542c5a5defa5e66c022340c088ca41ffa8c8eb23306f158beba]
Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210901124331] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:e9de3c0384410542c5a5defa5e66c022340c088ca41ffa8c8eb23306f158beba])].
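The RuntimeException above originates in JobFailure.handleFailure: the load-test harness treats any terminal job state other than DONE as a failure, so the 3-hour cancellation cutoff surfaces as a failed Gradle task even though the pipeline itself ran. A hedged sketch of that check (hypothetical; the real logic lives in org.apache.beam.sdk.loadtests.JobFailure and uses Beam's PipelineResult.State):

```java
public class TerminalStateCheck {
    // Simplified terminal states; Beam's PipelineResult.State has more values.
    enum JobState { DONE, FAILED, CANCELLED, UPDATED }

    // Sketch of the behavior seen in the log: anything other than DONE at the
    // end of a load test throws, which fails the surrounding Gradle task.
    static void handleFailure(JobState state) {
        if (state != JobState.DONE) {
            throw new RuntimeException("Invalid job state: " + state + ".");
        }
    }
}
```

Under this reading, the CANCELLED status from the monitored job is expected for a time-boxed streaming run, and the build failure is the harness's deliberate reporting of it.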
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:e9de3c0384410542c5a5defa5e66c022340c088ca41ffa8c8eb23306f158beba
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:e9de3c0384410542c5a5defa5e66c022340c088ca41ffa8c8eb23306f158beba
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:e9de3c0384410542c5a5defa5e66c022340c088ca41ffa8c8eb23306f158beba].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 36s

101 actionable tasks: 73 executed, 26 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/jasppfcaup67w

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
