See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java17/52/display/redirect?page=changes>
Changes:

[noreply] Update README.md
[marco.robles] Update README with latest PreCommit Jobs
[marco.robles] Update Postcommit jobs with latest jobs
[marco.robles] Update Performace job tests in readme
[marco.robles] update load job tests with latest updates
[marco.robles] update other jobs test with latest updates
[marco.robles] mismatch links fix
[marco.robles] update trigger phrase for some postCommit jobs
[marco.robles] correct trigger phrases in readme
[marco.robles] add pending jobs to readme
[noreply] Update README.md
[mmack] [BEAM-13246] Add support for S3 Bucket Key at the object level (AWS Sdk
[Pablo Estrada] Output successful rows from BQ Streaming Inserts
[schapman] BEAM-13439 Type annotation for ptransform_fn
[noreply] [BEAM-13606] Fail bundles with failed BigTable mutations (#16751)
[mmack] [adhoc] Remove remaining usage of Powermock from aws2.
[marco.robles] fix broken links in jobs & remove the invalid ones
[Kyle Weaver] Update Dataflow Python dev container images.
[Kiley Sok] Add java 17 to changes
[noreply] [BEAM-12914] Add missing 3.9 opcodes to type inference. (#16761)
[noreply] [BEAM-13321] Initial BigQueryIO externalization. (#16489)
[noreply] [BEAM-13193] Enable process bundle response elements embedding in Java
[noreply] [BEAM-13830] added a debeziumio_expansion_addr flag to GoSDK (#16780)
[noreply] Apply spotless. (#16783)
[Daniel Oliveira] [BEAM-13732] Switch x-lang BigQueryIO expansion service to GCP one.
[noreply] [BEAM-13858] Fix broken github action on :sdks:go:examples:wordCount
[Kiley Sok] add jira for runner v2
[noreply] [BEAM-13732] Go SDK BigQuery IO wrapper. Initial implementation.
[noreply] [BEAM-13732] Add example for Go BigQuery IO wrapper. (#16786)
[noreply] Update CHANGES.md with Go SDK milestones. (#16787)
[noreply] [BEAM-13193] Allow BeamFnDataOutboundObserver to flush elements.

------------------------------------------
[...truncated 44.85 KB...]
da1552893639: Preparing
2a252c3d0a1d: Preparing
841ed3dbd32a: Preparing
92399d10a8ad: Preparing
8cc024575d60: Preparing
3a45bf8d75fc: Preparing
5331ce361341: Preparing
babf5ef2e3ab: Preparing
613ab28cf833: Preparing
bed676ceab7a: Preparing
6398d5cccd2c: Preparing
0b0f2f2f5279: Preparing
da1552893639: Waiting
babf5ef2e3ab: Waiting
613ab28cf833: Waiting
bed676ceab7a: Waiting
2a252c3d0a1d: Waiting
6398d5cccd2c: Waiting
15ee06e1f7d7: Waiting
0b0f2f2f5279: Waiting
841ed3dbd32a: Waiting
112ca825abfb: Waiting
92399d10a8ad: Waiting
3a45bf8d75fc: Waiting
5331ce361341: Waiting
8cc024575d60: Waiting
23df38581584: Pushed
ee1669083a8c: Pushed
0e85edd95852: Pushed
376adae3d876: Pushed
da1552893639: Pushed
67b4b7ad98b5: Pushed
15ee06e1f7d7: Pushed
841ed3dbd32a: Pushed
8cc024575d60: Pushed
2a252c3d0a1d: Pushed
5331ce361341: Layer already exists
babf5ef2e3ab: Layer already exists
112ca825abfb: Pushed
bed676ceab7a: Layer already exists
613ab28cf833: Layer already exists
6398d5cccd2c: Layer already exists
0b0f2f2f5279: Layer already exists
3a45bf8d75fc: Pushed
92399d10a8ad: Pushed
20220209123414: digest: sha256:d8b4146d1cb4685782b97141ab1f5f215afee4a32cd804296233e60998c23adb size: 4314

> Task :sdks:java:testing:load-tests:run
Feb 09, 2022 12:36:24 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Feb 09, 2022 12:36:24 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 204 files. Enable logging at DEBUG level to see which files will be staged.
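The two messages above show the runner falling back to defaults for staging. As a hedged sketch only (not the configuration this job actually used; the bucket paths and class name are placeholders), such options can be set explicitly when launching a Dataflow pipeline:

    import org.apache.beam.runners.dataflow.DataflowRunner;
    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class OptionsSketch {
      // Illustrative only: explicit staging/temp locations avoid the
      // "falling back to gcpTempLocation" message seen in the log above.
      public static DataflowPipelineOptions fromArgs(String[] args) {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).withValidation().as(DataflowPipelineOptions.class);
        options.setRunner(DataflowRunner.class);
        options.setGcpTempLocation("gs://<bucket>/temp");     // placeholder bucket
        options.setStagingLocation("gs://<bucket>/staging");  // placeholder bucket
        return options;
      }
    }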
Feb 09, 2022 12:36:25 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Feb 09, 2022 12:36:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Feb 09, 2022 12:36:27 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 204 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Feb 09, 2022 12:36:28 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 204 files cached, 0 files newly uploaded in 0 seconds
Feb 09, 2022 12:36:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Feb 09, 2022 12:36:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <114077 bytes, hash f3d1f1139e5e054fdba3d9ee7a54ed7fb7be3bb3faf83a7db79251b6d9848e02> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-89HxE55eBU_bo9nuelTtf7e-O7P6-Dp9t5JRttmEjgI.pb
Feb 09, 2022 12:36:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Feb 09, 2022 12:36:30 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3dd31157, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@31c628e7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3240b2a4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@58434b19, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7d3fb0ef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7dbe2ebf, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4adc663e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@885e7ff, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@8bd86c8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4fa9ab6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2d3ef181, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@a2341c6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6e4c0d8c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3e3315d9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@64db4967, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@74e6094b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7a485a36, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5cf3157b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@625dfff3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@26350ea2]
Feb 09, 2022 12:36:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Feb 09, 2022 12:36:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Feb 09, 2022 12:36:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Feb 09, 2022 12:36:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Feb 09, 2022 12:36:30 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@806996, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@78b612c6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@257e0827, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@22752544, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@21ba2445, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@69d23296, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3c3820bb, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@376c7d7d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4784efd9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3fba233d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@427ae189, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@16a9eb2e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@76332405, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@187e5235, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@d1d8e1a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5434e40c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3b48e183, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@514de325, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@30c1da48, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43a65cd8]
Feb 09, 2022 12:36:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Feb 09, 2022 12:36:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Feb 09, 2022 12:36:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Feb 09, 2022 12:36:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Feb 09, 2022 12:36:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Feb 09, 2022 12:36:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Feb 09, 2022 12:36:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Feb 09, 2022 12:36:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Feb 09, 2022 12:36:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Feb 09, 2022 12:36:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Feb 09, 2022 12:36:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Feb 09, 2022 12:36:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.37.0-SNAPSHOT
Feb 09, 2022 12:36:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-09_04_36_30-16266957799977566730?project=apache-beam-testing
Feb 09, 2022 12:36:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-02-09_04_36_30-16266957799977566730
Feb 09, 2022 12:36:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-02-09_04_36_30-16266957799977566730
Feb 09, 2022 12:36:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-02-09T12:36:38.855Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java170dataflow0v20streaming0cogbk01-jenkins-02-h8r. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
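Following the naming advice in the warning above, a hedged one-line illustration (the job name is a placeholder, not the one this test uses, and `options` refers to the DataflowPipelineOptions sketch shown earlier):

    // Sketch only: an explicit, label-compliant job name (lowercase letters, digits and
    // dashes, at most 63 characters) keeps monitoring labels from being rewritten as above.
    options.setJobName("cogbk-java17-streaming-loadtest");  // placeholder name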
Feb 09, 2022 12:36:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:36:43.560Z: Worker configuration: e2-standard-2 in us-central1-b.
Feb 09, 2022 12:36:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:36:44.208Z: Expanding SplittableParDo operations into optimizable parts.
Feb 09, 2022 12:36:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:36:44.236Z: Expanding CollectionToSingleton operations into optimizable parts.
Feb 09, 2022 12:36:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:36:44.314Z: Expanding CoGroupByKey operations into optimizable parts.
Feb 09, 2022 12:36:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:36:44.371Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Feb 09, 2022 12:36:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:36:44.393Z: Expanding GroupByKey operations into streaming Read/Write steps
Feb 09, 2022 12:36:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:36:44.463Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Feb 09, 2022 12:36:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:36:44.564Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Feb 09, 2022 12:36:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:36:44.595Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Feb 09, 2022 12:36:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:36:44.637Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Feb 09, 2022 12:36:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:36:44.671Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Feb 09, 2022 12:36:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:36:44.703Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Feb 09, 2022 12:36:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:36:44.736Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Feb 09, 2022 12:36:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:36:44.768Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Feb 09, 2022 12:36:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:36:44.801Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Feb 09, 2022 12:36:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:36:44.834Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Feb 09, 2022 12:36:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:36:44.856Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Feb 09, 2022 12:36:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:36:44.877Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Feb 09, 2022 12:36:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:36:44.909Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Feb 09, 2022 12:36:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:36:44.932Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Feb 09, 2022 12:36:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:36:44.964Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Feb 09, 2022 12:36:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:36:44.996Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Feb 09, 2022 12:36:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:36:45.029Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Feb 09, 2022 12:36:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:36:45.075Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Feb 09, 2022 12:36:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:36:45.101Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Feb 09, 2022 12:36:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:36:45.125Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Feb 09, 2022 12:36:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:36:45.146Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Feb 09, 2022 12:36:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:36:45.178Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Feb 09, 2022 12:36:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:36:45.235Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Feb 09, 2022 12:36:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:36:45.274Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Feb 09, 2022 12:36:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:36:45.655Z: Starting 5 workers in us-central1-b...
Feb 09, 2022 12:37:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:36:57.791Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Feb 09, 2022 12:37:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:37:26.694Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Feb 09, 2022 12:38:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:38:26.498Z: Workers have started successfully.
Feb 09, 2022 12:38:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T12:38:26.525Z: Workers have started successfully.
Feb 09, 2022 4:00:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T16:00:39.049Z: Cancel request is committed for workflow job: 2022-02-09_04_36_30-16266957799977566730.
Feb 09, 2022 4:00:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T16:00:39.132Z: Cleaning up.
Feb 09, 2022 4:00:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T16:00:39.225Z: Stopping worker pool...
Feb 09, 2022 4:00:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T16:00:39.293Z: Stopping worker pool...
Feb 09, 2022 4:03:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T16:03:01.829Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Feb 09, 2022 4:03:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-09T16:03:01.888Z: Worker pool stopped.
Feb 09, 2022 4:03:07 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-02-09_04_36_30-16266957799977566730 finished with status CANCELLED.
Load test results for test (ID): d605d13e-dc1f-412e-9b59-2ed956956529 and timestamp: 2022-02-09T12:36:24.977000000Z:
Metric:                                   Value:
dataflow_v2_java17_runtime_sec            12097.965
dataflow_v2_java17_total_bytes_count      1.9972801E10
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
        at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
        at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
        at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
        at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
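The exception above is the load-test harness rejecting a terminal state other than DONE: the streaming job was cancelled, so the run is reported as failed. A hypothetical sketch of such a check, for illustration only (not the actual org.apache.beam.sdk.loadtests.JobFailure source):

    import org.apache.beam.sdk.PipelineResult;

    // Hypothetical reimplementation: any terminal state other than DONE (here, CANCELLED)
    // raises the RuntimeException seen above, which in turn fails the Gradle task below.
    static void failOnBadTerminalState(PipelineResult result) {
      PipelineResult.State state = result.getState();
      if (state != PipelineResult.State.DONE) {
        throw new RuntimeException("Invalid job state: " + state + ".");
      }
    }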
> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220209123414
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d8b4146d1cb4685782b97141ab1f5f215afee4a32cd804296233e60998c23adb
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220209123414] - referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d8b4146d1cb4685782b97141ab1f5f215afee4a32cd804296233e60998c23adb]
Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220209123414] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d8b4146d1cb4685782b97141ab1f5f215afee4a32cd804296233e60998c23adb])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d8b4146d1cb4685782b97141ab1f5f215afee4a32cd804296233e60998c23adb
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d8b4146d1cb4685782b97141ab1f5f215afee4a32cd804296233e60998c23adb
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d8b4146d1cb4685782b97141ab1f5f215afee4a32cd804296233e60998c23adb].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.
You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.
See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 29m 23s

109 actionable tasks: 73 executed, 32 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/3ua6j5qbno2ju

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
