See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/237/display/redirect?page=changes>
Changes:

[david.prieto.rivera] Missing contribution
[noreply] [BEAM-13803] Add support for native iterable side inputs to the Go SDK
[noreply] [BEAM-11095] Better error handling for illegal emit functions (#16776)
[noreply] Merge pull request #16613 from Supporting JdbcIO driver in classpath for
[noreply] Merge pull request #15848 from [BEAM-13835] An any-type implementation
[Valentyn Tymofieiev] [BEAM-12920] Assume that bare generators types define simple generators.
[Valentyn Tymofieiev] Add a container for Python 3.9.
[Valentyn Tymofieiev] Allow job submission with Python 3.9 on Dataflow runner
[Valentyn Tymofieiev] Add Python 3.9 test suites. Keep Dataflow V1 suites unchanged for now.
[Valentyn Tymofieiev] Add py3.9 Github actions suites.
[Valentyn Tymofieiev] Py39 Doc updates.
[Valentyn Tymofieiev] [BEAM-9980] Simplify run_validates_container.sh to avoid branching.
[Valentyn Tymofieiev] Update Cython to a new version that has py39 wheels.
[Valentyn Tymofieiev] [BEAM-13845] Fix comparison with potentially incomparable default
[Valentyn Tymofieiev] [BEAM-12920] Assume that bare generators types define simple generators.
[Valentyn Tymofieiev] Mark Python 3.9 as supported version.
[noreply] [release-2.36.0][website] Fix github release notes script, header for
[noreply] Use shell to run python for setupVirtualenv (#16796)
[Daniel Oliveira] [BEAM-13830] Properly shut down Debezium expansion service in IT script.
[noreply] Merge pull request #16659 from [BEAM-13774][Playground] Add user to
[Valentyn Tymofieiev] [BEAM-13868] Remove gsutil dep from hdfs IT test.
[noreply] [BEAM-13776][Playground] (#16731)
[noreply] [BEAM-13867] Drop NaNs returned by nlargest in flight_delays example
[noreply] Announce Python 3.9 in CHANGES.md (#16802)
[Brian Hulette] Moving to 2.38.0-SNAPSHOT on master branch.
[noreply] [BEAM-11095] Better error handling for iter/reiter/multimap (#16794)

------------------------------------------
[...truncated 47.39 KB...]
57aec383ac7b: Pushed
b4164e5f025d: Pushed
535d88b6378e: Pushed
8dda956c1426: Pushed
75f72f6b56b5: Pushed
a0603f3a02d3: Pushed
f3e8e87a4b44: Pushed
a1445b7ad2a8: Pushed
d695f0110876: Pushed
efb3f834d1ce: Pushed
0aa3674558b5: Layer already exists
bf1de93fcdde: Pushed
7c072cee6a29: Layer already exists
1e5fdc3d671c: Layer already exists
bed676ceab7a: Layer already exists
613ab28cf833: Layer already exists
6398d5cccd2c: Layer already exists
0b0f2f2f5279: Layer already exists
edb67dc046f7: Pushed
dd2cb0231f4d: Pushed
20220210124333: digest: sha256:d70eb28f103e3451da3491cfddc45351966ba703159dbc8e38f8a3e6a9e05426 size: 4520

> Task :sdks:java:testing:load-tests:run
Feb 10, 2022 12:45:34 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Feb 10, 2022 12:45:34 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 204 files. Enable logging at DEBUG level to see which files will be staged.
Feb 10, 2022 12:45:35 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Feb 10, 2022 12:45:35 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Feb 10, 2022 12:45:37 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 204 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Feb 10, 2022 12:45:37 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 204 files cached, 0 files newly uploaded in 0 seconds
Feb 10, 2022 12:45:37 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Feb 10, 2022 12:45:37 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <114077 bytes, hash e6bd569948c953f638f65e779a71f8956f2f2cd1860191aaf2b885e327c45633> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-5r1WmUjJU_Y49l53mnH4lW8vLNGGAZGq8riF4yfEVjM.pb
Feb 10, 2022 12:45:39 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Feb 10, 2022 12:45:39 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3dd31157, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@31c628e7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3240b2a4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@58434b19, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7d3fb0ef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7dbe2ebf, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4adc663e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@885e7ff, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@8bd86c8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4fa9ab6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2d3ef181, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@a2341c6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6e4c0d8c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3e3315d9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@64db4967, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@74e6094b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7a485a36, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5cf3157b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@625dfff3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@26350ea2]
Feb 10, 2022 12:45:39 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Feb 10, 2022 12:45:39 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Feb 10, 2022 12:45:39 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Feb 10, 2022 12:45:39 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Feb 10, 2022 12:45:39 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@806996, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@78b612c6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@257e0827, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@22752544, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@21ba2445, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@69d23296, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3c3820bb, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@376c7d7d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4784efd9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3fba233d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@427ae189, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@16a9eb2e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@76332405, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@187e5235, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@d1d8e1a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5434e40c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3b48e183, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@514de325, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@30c1da48, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43a65cd8]
Feb 10, 2022 12:45:39 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Feb 10, 2022 12:45:39 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Feb 10, 2022 12:45:39 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Feb 10, 2022 12:45:39 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Feb 10, 2022 12:45:39 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Feb 10, 2022 12:45:39 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Feb 10, 2022 12:45:39 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Feb 10, 2022 12:45:39 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Feb 10, 2022 12:45:39 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Feb 10, 2022 12:45:39 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Feb 10, 2022 12:45:39 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Feb 10, 2022 12:45:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.38.0-SNAPSHOT
Feb 10, 2022 12:45:40 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-10_04_45_39-2704250670955367490?project=apache-beam-testing
Feb 10, 2022 12:45:40 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-02-10_04_45_39-2704250670955367490
Feb 10, 2022 12:45:40 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-02-10_04_45_39-2704250670955367490
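The cancel command above can be run as a single line. The sketch below only restates it with the job ID, project, and region copied from this log, plus a standard 'gcloud dataflow jobs describe' call (not part of the original output) that one might use to confirm the job's state first; it is illustrative, not something the build itself ran:

    # Illustrative only: check the job's current state (job ID, project, and region copied from the log above).
    gcloud dataflow jobs describe 2022-02-10_04_45_39-2704250670955367490 \
        --project=apache-beam-testing --region=us-central1

    # Cancel the job, as suggested by the runner output above.
    gcloud dataflow jobs cancel 2022-02-10_04_45_39-2704250670955367490 \
        --project=apache-beam-testing --region=us-central1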
Feb 10, 2022 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-02-10T12:45:50.733Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-02-6o4n. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Feb 10, 2022 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:45:56.036Z: Worker configuration: e2-standard-2 in us-central1-b.
Feb 10, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:45:56.896Z: Expanding SplittableParDo operations into optimizable parts.
Feb 10, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:45:56.925Z: Expanding CollectionToSingleton operations into optimizable parts.
Feb 10, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:45:56.997Z: Expanding CoGroupByKey operations into optimizable parts.
Feb 10, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:45:57.066Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Feb 10, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:45:57.095Z: Expanding GroupByKey operations into streaming Read/Write steps
Feb 10, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:45:57.149Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Feb 10, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:45:57.251Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Feb 10, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:45:57.279Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Feb 10, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:45:57.306Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Feb 10, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:45:57.339Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Feb 10, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:45:57.373Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Feb 10, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:45:57.409Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Feb 10, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:45:57.431Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Feb 10, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:45:57.477Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Feb 10, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:45:57.510Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Feb 10, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:45:57.542Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Feb 10, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:45:57.579Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Feb 10, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:45:57.614Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Feb 10, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:45:57.648Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Feb 10, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:45:57.669Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Feb 10, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:45:57.694Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Feb 10, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:45:57.726Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Feb 10, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:45:57.758Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Feb 10, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:45:57.803Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Feb 10, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:45:57.827Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Feb 10, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:45:57.862Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Feb 10, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:45:57.887Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Feb 10, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:45:57.907Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Feb 10, 2022 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:45:57.935Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Feb 10, 2022 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:45:58.287Z: Starting 5 workers in us-central1-b...
Feb 10, 2022 12:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:46:02.315Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Feb 10, 2022 12:46:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:46:38.629Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Feb 10, 2022 12:47:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:47:40.557Z: Workers have started successfully.
Feb 10, 2022 12:47:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T12:47:40.593Z: Workers have started successfully.
Feb 10, 2022 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T16:00:36.045Z: Cancel request is committed for workflow job: 2022-02-10_04_45_39-2704250670955367490.
Feb 10, 2022 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T16:00:36.110Z: Cleaning up.
Feb 10, 2022 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T16:00:36.186Z: Stopping worker pool...
Feb 10, 2022 4:00:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T16:00:36.239Z: Stopping worker pool...
Feb 10, 2022 4:02:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T16:02:55.035Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Feb 10, 2022 4:02:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-10T16:02:55.077Z: Worker pool stopped.
Feb 10, 2022 4:03:03 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-02-10_04_45_39-2704250670955367490 finished with status CANCELLED.

Load test results for test (ID): 11464781-aed8-45cf-91b2-a0e767eaa5bb and timestamp: 2022-02-10T12:45:34.858000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                   11526.713
dataflow_v2_java11_total_bytes_count         3.00254704E10

Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:51)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:139)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220210124333
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d70eb28f103e3451da3491cfddc45351966ba703159dbc8e38f8a3e6a9e05426
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220210124333]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d70eb28f103e3451da3491cfddc45351966ba703159dbc8e38f8a3e6a9e05426]
Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220210124333] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d70eb28f103e3451da3491cfddc45351966ba703159dbc8e38f8a3e6a9e05426])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d70eb28f103e3451da3491cfddc45351966ba703159dbc8e38f8a3e6a9e05426
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d70eb28f103e3451da3491cfddc45351966ba703159dbc8e38f8a3e6a9e05426
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d70eb28f103e3451da3491cfddc45351966ba703159dbc8e38f8a3e6a9e05426].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:9ad812c02dd245785430c732fa9769e7eaac5982fbeb695b53b4928449dfd98f
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:9ad812c02dd245785430c732fa9769e7eaac5982fbeb695b53b4928449dfd98f
ERROR: (gcloud.container.images.delete) Not found: response: {'docker-distribution-api-version': 'registry/2.0', 'content-type': 'application/json', 'date': 'Thu, 10 Feb 2022 16:03:13 GMT', 'server': 'Docker Registry', 'cache-control': 'private', 'x-xss-protection': '0', 'x-frame-options': 'SAMEORIGIN', 'transfer-encoding': 'chunked', 'status': '404', 'content-length': '168', '-content-encoding': 'gzip'}
Failed to compute blob liveness for manifest: 'sha256:9ad812c02dd245785430c732fa9769e7eaac5982fbeb695b53b4928449dfd98f': None

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages FAILED
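The cleanUpDockerJavaImages failure above is the registry returning 404 for a manifest (sha256:9ad812...) that had already been removed. As a rough manual approximation of what ./scripts/cleanup_untagged_gcr_images.sh appears to attempt (the script itself is not shown in this log), one could list the remaining untagged digests and delete them individually; the commands below are an illustrative sketch, not part of the original build output:

    # Illustrative only: list digests of untagged images left in the repository named in the log above.
    gcloud container images list-tags us.gcr.io/apache-beam-testing/java-postcommit-it/java \
        --filter='-tags:*' --format='get(digest)'

    # Delete a single digest; a 404 like the one above simply means it is already gone.
    gcloud container images delete \
        us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:<digest> --quiet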
FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/runners/google-cloud-dataflow-java/build.gradle'> line: 297

* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:cleanUpDockerJavaImages'.
> Process 'command './scripts/cleanup_untagged_gcr_images.sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 3h 19m 53s

109 actionable tasks: 74 executed, 31 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/2ogl5mpme33i4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
