See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_Streaming/553/display/redirect?page=changes>
Changes:
[heejong] [BEAM-11519] Adding PYPI_INSTALL_REQ role and SDK container support
[tomasz.szerszen] Create Spark Metrics in directory using Spark History Server format
[tomasz.szerszen] java spotless apply
[tomasz.szerszen] add --spark-history-dir option in spark job server
[tomasz.szerszen] add driver distribution logs
[tomasz.szerszen] remove start import check
[tomasz.szerszen] spotless apply
[heejong] move idCounter variable and add comments
[heejong] simplify the branches, change _req to _requirement
[tomasz.szerszen] render all metrics & add eventLogEnabled
[tomasz.szerszen] add exception when eventLogEnabled is true
[tomasz.szerszen] handle eventLogEnabled
[tomasz.szerszen] apply spotless
[tomasz.szerszen] remove star import
[tomasz.szerszen] run java spotless
[tomasz.szerszen] fix renderName is static from now on
[sychen] Add transform translator for GroupIntoBatches in Java.
[tomasz.szerszen] spark.executor.id loop and switch to boolean
[tomasz.szerszen] spotless apply
[tomasz.szerszen] remove uncessary code
[samuelw] [BEAM-11657] Avoid repeated reflection calls for Kafka deserialization
[Pablo Estrada] [BEAM-11705] Fixing ignore_insert_id implementation
[tomasz.szerszen] scope.Option.apply
[tomasz.szerszen] remove options from spark job server configuration & fix the spark
[tomasz.szerszen] remove options from spark job server configuration & fix the spark
[tomasz.szerszen] remove options from spark job server configuration & fix the spark
[tomasz.szerszen] remove options from spark job server configuration & fix the spark
[tomasz.szerszen] remove options from spark job server configuration & fix the spark
[tomasz.szerszen] remove options from spark job server configuration & fix the spark
[sychen] Fix checkStyle error; add a condition to disable autosharding for JRH
[sychen] Fail jobs that would otherwise fall back to the default implementation
[sychen] Fix failed test; update checkArgument logs.
[tomasz.szerszen.poczta] minor cleanup
[tomasz.szerszen.poczta] minor cleanup
[noreply] [BEAM-11531] Allow pandas <1.3.0 (#13681)
[tomasz.szerszen.poczta] minor improvments
[tomasz.szerszen.poczta] remove whitespace changes
[tomasz.szerszen.poczta] remove whitespace changes
[tysonjh] Bump Dataflow worker container version.
[noreply] [BEAM-11476] Resolve flaky tests (#13881)
[heejong] change urn for pip install requirements file

------------------------------------------
[...truncated 26.62 KB...]
> Task :sdks:java:core:compileJava FROM-CACHE
> Task :sdks:java:core:classes
> Task :sdks:java:core:shadowJar FROM-CACHE
> Task :sdks:java:extensions:protobuf:extractIncludeProto
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :sdks:java:io:synthetic:compileJava FROM-CACHE
> Task :sdks:java:io:synthetic:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:compileJava FROM-CACHE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :sdks:java:io:synthetic:jar
> Task :sdks:java:testing:test-utils:compileJava FROM-CACHE
> Task :sdks:java:testing:test-utils:classes UP-TO-DATE
> Task :sdks:java:io:kinesis:compileJava FROM-CACHE
> Task :sdks:java:io:kinesis:classes UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:compileJava FROM-CACHE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :sdks:java:testing:test-utils:jar
> Task :sdks:java:extensions:protobuf:jar
> Task :sdks:java:io:kinesis:jar
> Task :sdks:java:fn-execution:compileJava FROM-CACHE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :sdks:java:fn-execution:jar
> Task :vendor:sdks-java-extensions-protobuf:shadowJar FROM-CACHE
> Task :runners:core-construction-java:compileJava FROM-CACHE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :runners:core-construction-java:jar
> Task :runners:core-java:compileJava FROM-CACHE
> Task :runners:core-java:classes UP-TO-DATE
> Task :runners:core-java:jar
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava FROM-CACHE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar
> Task :sdks:java:harness:compileJava FROM-CACHE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar
> Task :sdks:java:harness:shadowJar FROM-CACHE
> Task :runners:java-fn-execution:compileJava FROM-CACHE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar
> Task :sdks:java:expansion-service:compileJava FROM-CACHE
> Task :sdks:java:expansion-service:classes UP-TO-DATE
> Task :sdks:java:expansion-service:jar
> Task :sdks:java:io:kafka:compileJava FROM-CACHE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:io:kafka:jar
> Task :sdks:java:io:google-cloud-platform:compileJava FROM-CACHE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:jar
> Task :sdks:java:testing:load-tests:compileJava FROM-CACHE
> Task :sdks:java:testing:load-tests:classes UP-TO-DATE
> Task :sdks:java:testing:load-tests:jar
> Task :runners:google-cloud-dataflow-java:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-java:classes
> Task :runners:google-cloud-dataflow-java:jar
> Task :runners:google-cloud-dataflow-java:****:legacy-****:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-java:****:legacy-****:classes UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:****:legacy-****:shadowJar FROM-CACHE
> Task :sdks:java:testing:load-tests:run

Feb 04, 2021 12:22:54 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Feb 04, 2021 12:22:54 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 187 files. Enable logging at DEBUG level to see which files will be staged.
Feb 04, 2021 12:22:55 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Feb 04, 2021 12:22:55 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Feb 04, 2021 12:22:57 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 188 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Feb 04, 2021 12:22:57 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
INFO: Staging custom dataflow-****.jar as beam-runners-google-cloud-dataflow-java-legacy-****-2.29.0-SNAPSHOT-A6m4bxgyKMMyKxqw0gkVdW8WGw4nJLEEWxooQpL0-zk.jar
Feb 04, 2021 12:22:58 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 188 files cached, 0 files newly uploaded in 0 seconds
Feb 04, 2021 12:22:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Feb 04, 2021 12:22:58 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46185a1b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@51288417, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@60cf62ad, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1e0895f5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1ac4ccad, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fd9ebde, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@14982a82, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4ee5b2d9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@72f8ae0c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@323f3c96, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6726cc69, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4b6d92e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@33899f7a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7899de11, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@290d10ef, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1bc0d349, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@644ded04, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5292ceca, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@13d9261f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@e9ef5b6]
Feb 04, 2021 12:22:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Feb 04, 2021 12:22:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Feb 04, 2021 12:22:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Feb 04, 2021 12:22:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Feb 04, 2021 12:22:58 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@54e2fe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@70972170, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@119aa36, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4e1a46fb, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@69fe0ed4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@20ab3e3a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6caf7803, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@709ed6f3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@698fee9a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@102c577f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7d44a19, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1fb2d5e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1716e8c5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6573d2f7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4052c8c2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@181b8c4b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@38eb0f4d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@437486cd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@15b642b9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@518bfd90]
Feb 04, 2021 12:22:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Feb 04, 2021 12:22:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Feb 04, 2021 12:22:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Feb 04, 2021 12:22:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Feb 04, 2021 12:22:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Feb 04, 2021 12:22:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Feb 04, 2021 12:22:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Feb 04, 2021 12:22:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Feb 04, 2021 12:22:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Feb 04, 2021 12:22:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Feb 04, 2021 12:22:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Feb 04, 2021 12:22:58 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
Feb 04, 2021 12:22:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <94483 bytes, hash fbbaa59a6a1c7fb5964ae20a274494333116e7962577d33e9efb6639d4af7b3a> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline--7qlmmocf7WWSuIKJ0SUMzEW55Yld9M-nvtmOdSvezo.pb
Feb 04, 2021 12:22:59 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.29.0-SNAPSHOT
Feb 04, 2021 12:23:00 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-04_04_22_59-12294497740068693448?project=apache-beam-testing
Feb 04, 2021 12:23:00 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-02-04_04_22_59-12294497740068693448
Feb 04, 2021 12:23:00 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-02-04_04_22_59-12294497740068693448
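When a run like this has to be cancelled or inspected from a script, the job ID embedded in the "Submitted job:" line above is the piece that matters. As a small illustration (a hypothetical helper, not part of Beam or this Jenkins job), the ID can be pulled out of a submission log line with a regular expression:

```python
import re
from typing import Optional

# Dataflow job IDs in these logs look like 2021-02-04_04_22_59-12294497740068693448:
# a submission timestamp followed by a numeric suffix.
JOB_ID_RE = re.compile(r"Submitted job: (\d{4}-\d{2}-\d{2}_\d{2}_\d{2}_\d{2}-\d+)")

def extract_job_id(log_line: str) -> Optional[str]:
    """Return the Dataflow job ID embedded in a submission log line, if any."""
    m = JOB_ID_RE.search(log_line)
    return m.group(1) if m else None

line = "INFO: Submitted job: 2021-02-04_04_22_59-12294497740068693448"
print(extract_job_id(line))  # 2021-02-04_04_22_59-12294497740068693448
```

The extracted ID can then be passed to the `gcloud dataflow jobs cancel` command that the runner prints in the log.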
Feb 04, 2021 12:23:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-02-04T12:23:03.387Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java0dataflow0streaming0cogbk01-jenkins-0204122-muze. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Feb 04, 2021 12:23:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-04T12:23:07.437Z: Worker configuration: n1-standard-4 in us-central1-f.
Feb 04, 2021 12:23:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-04T12:23:08.122Z: Expanding CoGroupByKey operations into optimizable parts.
Feb 04, 2021 12:23:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-04T12:23:08.192Z: Expanding SplittableProcessKeyed operations into optimizable parts.
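The warning above means Dataflow rewrote the workflow name into something that satisfies the Cloud Label restrictions (lowercase letters, digits, and hyphens, starting with a lowercase letter, at most 63 characters). Dataflow's exact rewriting algorithm is internal; the sketch below is only a rough approximation of that kind of sanitization, using a hypothetical helper name:

```python
import re

def to_cloud_label(name: str) -> str:
    # Rough approximation (an assumption, NOT Dataflow's actual algorithm):
    # lowercase everything, replace disallowed characters with '-',
    # force a lowercase-letter start, and trim to the 63-character limit.
    label = name.lower()
    label = re.sub(r"[^a-z0-9-]", "-", label)  # only lowercase letters, digits, '-'
    label = re.sub(r"^[^a-z]+", "", label)     # must start with a lowercase letter
    return label[:63]

print(to_cloud_label("Load Tests Java CoGBK Dataflow Streaming #553"))
```

Naming the Jenkins job with only label-safe characters in the first place avoids the rewrite entirely.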
Feb 04, 2021 12:23:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-04T12:23:08.227Z: Expanding GroupByKey operations into streaming Read/Write steps
Feb 04, 2021 12:23:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-04T12:23:08.298Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Feb 04, 2021 12:23:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-04T12:23:08.420Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Feb 04, 2021 12:23:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-04T12:23:08.460Z: Unzipping flatten s11 for input s10.org.apache.beam.sdk.values.PCollection.<init>:402#5f2ef1f005ae0b4
Feb 04, 2021 12:23:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-04T12:23:08.496Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable1
Feb 04, 2021 12:23:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-04T12:23:08.520Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable0
Feb 04, 2021 12:23:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-04T12:23:08.561Z: Fusing consumer Read input/StripIds into Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds
Feb 04, 2021 12:23:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-04T12:23:08.594Z: Fusing consumer Read co-input/StripIds into Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds
Feb 04, 2021 12:23:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-04T12:23:08.614Z: Fusing consumer Collect start time metrics (co-input) into Read co-input/StripIds
Feb 04, 2021 12:23:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-04T12:23:08.647Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)
Feb 04, 2021 12:23:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-04T12:23:08.679Z: Fusing consumer CoGroupByKey/MakeUnionTable1 into Window.Into()2/Window.Assign
Feb 04, 2021 12:23:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-04T12:23:08.710Z: Fusing consumer Collect start time metrics (input) into Read input/StripIds
Feb 04, 2021 12:23:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-04T12:23:08.733Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)
Feb 04, 2021 12:23:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-04T12:23:08.766Z: Fusing consumer CoGroupByKey/MakeUnionTable0 into Window.Into()/Window.Assign
Feb 04, 2021 12:23:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-04T12:23:08.799Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Feb 04, 2021 12:23:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-04T12:23:08.847Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn into CoGroupByKey/GBK/MergeBuckets
Feb 04, 2021 12:23:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-04T12:23:08.881Z: Fusing consumer Ungroup and reiterate into CoGroupByKey/ConstructCoGbkResultFn
Feb 04, 2021 12:23:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-04T12:23:08.899Z: Fusing consumer Collect total bytes into Ungroup and reiterate
Feb 04, 2021 12:23:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-04T12:23:08.931Z: Fusing consumer Collect end time metrics into Collect total bytes
Feb 04, 2021 12:23:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-04T12:23:09.380Z: Executing operation CoGroupByKey/GBK/ReadStream+CoGroupByKey/GBK/MergeBuckets+CoGroupByKey/ConstructCoGbkResultFn+Ungroup and reiterate+Collect total bytes+Collect end time metrics
Feb 04, 2021 12:23:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-04T12:23:09.416Z: Executing operation Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds+Read co-input/StripIds+Collect start time metrics (co-input)+Window.Into()2/Window.Assign+CoGroupByKey/MakeUnionTable1+CoGroupByKey/GBK/WriteStream
Feb 04, 2021 12:23:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-04T12:23:09.455Z: Executing operation Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds+Read input/StripIds+Collect start time metrics (input)+Window.Into()/Window.Assign+CoGroupByKey/MakeUnionTable0+CoGroupByKey/GBK/WriteStream
Feb 04, 2021 12:23:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-04T12:23:09.461Z: Starting 5 ****s in us-central1-f...
Feb 04, 2021 12:23:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-04T12:23:36.845Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Feb 04, 2021 12:23:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-04T12:23:49.533Z: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Feb 04, 2021 12:24:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-04T12:24:13.298Z: Workers have started successfully.
Feb 04, 2021 12:24:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-04T12:24:13.333Z: Workers have started successfully.
Feb 04, 2021 1:57:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-04T13:57:21.881Z: Workers have started successfully.
Feb 04, 2021 1:57:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-04T13:57:21.996Z: Executing operation CoGroupByKey/GBK/ReadStream+CoGroupByKey/GBK/MergeBuckets+CoGroupByKey/ConstructCoGbkResultFn+Ungroup and reiterate+Collect total bytes+Collect end time metrics
Feb 04, 2021 1:57:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-04T13:57:21.998Z: Executing operation Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds+Read co-input/StripIds+Collect start time metrics (co-input)+Window.Into()2/Window.Assign+CoGroupByKey/MakeUnionTable1+CoGroupByKey/GBK/WriteStream
Feb 04, 2021 1:57:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-04T13:57:22.000Z: Executing operation Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds+Read input/StripIds+Collect start time metrics (input)+Window.Into()/Window.Assign+CoGroupByKey/MakeUnionTable0+CoGroupByKey/GBK/WriteStream
Feb 04, 2021 1:57:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-04T13:57:25.818Z: Worker configuration: n1-standard-4 in us-central1-f.
Feb 04, 2021 1:57:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-04T13:57:38.297Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Feb 04, 2021 4:00:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-04T16:00:29.050Z: Cancel request is committed for workflow job: 2021-02-04_04_22_59-12294497740068693448.
Feb 04, 2021 4:00:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-04T16:00:29.239Z: Finished operation Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds+Read co-input/StripIds+Collect start time metrics (co-input)+Window.Into()2/Window.Assign+CoGroupByKey/MakeUnionTable1+CoGroupByKey/GBK/WriteStream
Feb 04, 2021 4:00:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-04T16:00:29.239Z: Finished operation CoGroupByKey/GBK/ReadStream+CoGroupByKey/GBK/MergeBuckets+CoGroupByKey/ConstructCoGbkResultFn+Ungroup and reiterate+Collect total bytes+Collect end time metrics
Feb 04, 2021 4:00:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-04T16:00:29.239Z: Finished operation Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds+Read input/StripIds+Collect start time metrics (input)+Window.Into()/Window.Assign+CoGroupByKey/MakeUnionTable0+CoGroupByKey/GBK/WriteStream
Feb 04, 2021 4:00:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-04T16:00:29.428Z: Cleaning up.
Feb 04, 2021 4:00:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-04T16:00:29.497Z: Stopping **** pool...
Feb 04, 2021 4:01:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-04T16:01:25.957Z: Autoscaling: Reduced the number of ****s to 0 based on low average **** CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Feb 04, 2021 4:01:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-04T16:01:26.004Z: Worker pool stopped.
Feb 04, 2021 4:01:47 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-02-04_04_22_59-12294497740068693448 finished with status CANCELLED.
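The CANCELLED terminal state is what turns this otherwise clean-looking run into a build failure: the load-test harness treats any terminal state other than DONE as an error (see the JobFailure.handleFailure frame in the stack trace that follows). A minimal sketch of that kind of check, in Python rather than the harness's actual Java code:

```python
# Simplified sketch (an assumption, not the real Beam implementation) of the
# terminal-state check performed by the load-test harness: any terminal state
# other than DONE aborts the run with an error.
class InvalidJobStateError(RuntimeError):
    pass

def check_terminal_state(state: str) -> None:
    if state != "DONE":
        raise InvalidJobStateError(f"Invalid job state: {state}.")

try:
    check_terminal_state("CANCELLED")
except InvalidJobStateError as e:
    print(e)  # Invalid job state: CANCELLED.
```

This matches the observed behavior: the streaming job is deliberately cancelled after its run window, so the harness always sees CANCELLED rather than DONE and fails unless cancellation is treated as an expected outcome.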
Load test results for test (ID): 261fe6b8-6dc2-4cf2-a0f9-527958d20de7 and timestamp: 2021-02-04T12:22:54.819000000Z:

                 Metric:                    Value:
    dataflow_runtime_sec                 12866.082
dataflow_total_bytes_count             1.8507823E9

Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:137)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 39m 28s
90 actionable tasks: 55 executed, 35 from cache

Publishing build scan...
https://gradle.com/s/towuaar4mes3a

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
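For context, the two metrics the load test did report imply an average throughput for the run. A quick back-of-the-envelope calculation, using the values exactly as printed above:

```python
# Metrics reported by the load test above.
runtime_sec = 12866.082      # dataflow_runtime_sec (about 3.6 hours)
total_bytes = 1.8507823e9    # dataflow_total_bytes_count (about 1.85 GB)

# Average bytes processed per second over the run.
throughput = total_bytes / runtime_sec
print(f"{throughput:,.0f} B/s")  # roughly 144 KB/s on average
```

This is only the mean over the whole run, including the long idle stretch visible in the log between 12:24 and 13:57; instantaneous throughput while workers were actively processing would differ.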
