See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_Streaming/551/display/redirect?page=changes>

Changes:

[zyichi] Setup InfluxDbIO_IT jenkins job cron

[randomstep] [BEAM-11652] bump aircompressor to 0.1.8

[Kyle Weaver] [BEAM-10379] Remove BIT_XOR from ZetaSQL supported functions list.

[Kyle Weaver] [BEAM-11732] Revert flink-clients from runtime to compile configuration.

[noreply] [BEAM-11731] Restrict to numpy <1.20.0 (#13870)

[noreply] [BEAM-11357] Copy Annotations when cloning PTransforms (#13865)

[noreply] [BEAM-11693] Update formatting. Fix email template (#13815)


------------------------------------------
[...truncated 24.71 KB...]
> Task :model:job-management:generateProto FROM-CACHE
> Task :model:fn-execution:generateProto FROM-CACHE
> Task :model:job-management:compileJava FROM-CACHE
> Task :model:job-management:classes
> Task :model:fn-execution:compileJava FROM-CACHE
> Task :model:fn-execution:classes
> Task :model:job-management:shadowJar FROM-CACHE
> Task :model:fn-execution:shadowJar FROM-CACHE
> Task :sdks:java:core:compileJava FROM-CACHE
> Task :sdks:java:core:classes
> Task :sdks:java:core:shadowJar FROM-CACHE
> Task :vendor:sdks-java-extensions-protobuf:compileJava FROM-CACHE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :sdks:java:testing:test-utils:compileJava FROM-CACHE
> Task :sdks:java:testing:test-utils:classes UP-TO-DATE
> Task :sdks:java:io:synthetic:compileJava FROM-CACHE
> Task :sdks:java:io:synthetic:classes UP-TO-DATE
> Task :sdks:java:io:kinesis:compileJava FROM-CACHE
> Task :sdks:java:io:kinesis:classes UP-TO-DATE
> Task :sdks:java:io:synthetic:jar
> Task :sdks:java:testing:test-utils:jar
> Task :sdks:java:extensions:protobuf:extractIncludeProto
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :sdks:java:fn-execution:compileJava FROM-CACHE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:compileJava FROM-CACHE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:shadowJar FROM-CACHE
> Task :sdks:java:fn-execution:jar
> Task :sdks:java:io:kinesis:jar
> Task :sdks:java:extensions:protobuf:jar
> Task :runners:core-construction-java:compileJava FROM-CACHE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :runners:core-construction-java:jar
> Task :runners:core-java:compileJava FROM-CACHE
> Task :runners:core-java:classes UP-TO-DATE
> Task :runners:core-java:jar
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava FROM-CACHE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar
> Task :sdks:java:harness:compileJava FROM-CACHE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar
> Task :sdks:java:harness:shadowJar FROM-CACHE
> Task :runners:java-fn-execution:compileJava FROM-CACHE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar
> Task :sdks:java:expansion-service:compileJava FROM-CACHE
> Task :sdks:java:expansion-service:classes UP-TO-DATE
> Task :sdks:java:expansion-service:jar
> Task :sdks:java:io:kafka:compileJava FROM-CACHE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:io:kafka:jar
> Task :sdks:java:io:google-cloud-platform:compileJava FROM-CACHE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:jar
> Task :sdks:java:testing:load-tests:compileJava FROM-CACHE
> Task :sdks:java:testing:load-tests:classes UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-java:classes
> Task :sdks:java:testing:load-tests:jar
> Task :runners:google-cloud-dataflow-java:jar
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:classes UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar FROM-CACHE

> Task :sdks:java:testing:load-tests:run
Feb 02, 2021 12:23:44 PM 
org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory
 create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Feb 02, 2021 12:23:45 PM org.apache.beam.runners.dataflow.DataflowRunner 
fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from 
the classpath: will stage 187 files. Enable logging at DEBUG level to see which 
files will be staged.
Feb 02, 2021 12:23:46 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Feb 02, 2021 12:23:48 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing 
implications related to Google Compute Engine usage and other Google Cloud 
Services.
Feb 02, 2021 12:23:52 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
INFO: Uploading 188 files from PipelineOptions.filesToStage to staging location 
to prepare for execution.
Feb 02, 2021 12:23:53 PM 
org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes 
forFileToStage
INFO: Staging custom dataflow-worker.jar as 
beam-runners-google-cloud-dataflow-java-legacy-worker-2.29.0-SNAPSHOT-hcrHl3nIcMxtK6uIC61YfEhyUatJnS8yaBhg51-cb3k.jar
Feb 02, 2021 12:23:55 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
INFO: Staging files complete: 188 files cached, 0 files newly uploaded in 3 
seconds
Feb 02, 2021 12:23:56 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as 
step s1
Feb 02, 2021 12:23:56 PM 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: 
[org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1a865273, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@288ca5f0, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4068102e, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@44bd4b0a, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6c008c24, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@216e0771, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@21079a12, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fcc6023, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@67c5ac52, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@36417a54, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2b8bb184, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@472a11ae, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@dc79225, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@30e9ca13, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46185a1b, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@51288417, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@60cf62ad, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1e0895f5, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1ac4ccad, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fd9ebde]
Feb 02, 2021 12:23:56 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Feb 02, 2021 12:23:56 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Feb 02, 2021 12:23:56 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Feb 02, 2021 12:23:56 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as 
step s5
Feb 02, 2021 12:23:56 PM 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: 
[org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5e7c141d, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43af351a, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1305c126, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@72f9f27c, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4c1bdcc2, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@762637be, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4b97c4ad, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7640a5b1, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@642f9a77, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@23f3da8b, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5634d0f4, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@252a8aae, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3d4e405e, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@54e2fe, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@70972170, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@119aa36, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4e1a46fb, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@69fe0ed4, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@20ab3e3a, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6caf7803]
Feb 02, 2021 12:23:56 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Feb 02, 2021 12:23:56 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Feb 02, 2021 12:23:56 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Feb 02, 2021 12:23:56 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Feb 02, 2021 12:23:56 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Feb 02, 2021 12:23:56 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Feb 02, 2021 12:23:56 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Feb 02, 2021 12:23:56 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Feb 02, 2021 12:23:56 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Feb 02, 2021 12:23:56 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Feb 02, 2021 12:23:56 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Feb 02, 2021 12:23:56 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging pipeline description to 
gs://temp-storage-for-perf-tests/loadtests/staging/
Feb 02, 2021 12:23:56 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading <94483 bytes, hash 
6dffa838cf75788ffb79ca030d5da82640bc212e5bd459b805759b0c2f0f1401> to 
gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-bf-oOM91eI_7ecoDDV2oJkC8IS5b1Fm4BXWbDC8PFAE.pb
Feb 02, 2021 12:23:57 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.29.0-SNAPSHOT
Feb 02, 2021 12:23:59 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-02_04_23_57-10322241500736390368?project=apache-beam-testing
Feb 02, 2021 12:23:59 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-02-02_04_23_57-10322241500736390368
Feb 02, 2021 12:23:59 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-02-02_04_23_57-10322241500736390368
Feb 02, 2021 12:24:02 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-02-02T12:24:00.984Z: The workflow name is not a valid Cloud 
Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring 
will be labeled with this modified job name: 
load0tests0java0dataflow0streaming0cogbk01-jenkins-0202122-64oy. For the best 
monitoring experience, please name your job with a valid Cloud Label. For 
details, see: 
https://cloud.google.com/compute/docs/labeling-resources#restrictions
Feb 02, 2021 12:24:04 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-02T12:24:04.056Z: Worker configuration: n1-standard-4 in 
us-central1-f.
Feb 02, 2021 12:24:05 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-02T12:24:04.878Z: Expanding CoGroupByKey operations into 
optimizable parts.
Feb 02, 2021 12:24:05 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-02T12:24:04.972Z: Expanding SplittableProcessKeyed operations 
into optimizable parts.
Feb 02, 2021 12:24:05 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-02T12:24:05.014Z: Expanding GroupByKey operations into streaming 
Read/Write steps
Feb 02, 2021 12:24:05 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-02T12:24:05.097Z: Lifting ValueCombiningMappingFns into 
MergeBucketsMappingFns
Feb 02, 2021 12:24:05 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-02T12:24:05.276Z: Fusing adjacent ParDo, Read, Write, and Flatten 
operations
Feb 02, 2021 12:24:05 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-02T12:24:05.311Z: Unzipping flatten s11 for input 
s10.org.apache.beam.sdk.values.PCollection.<init>:402#5f2ef1f005ae0b4
Feb 02, 2021 12:24:05 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-02T12:24:05.356Z: Fusing unzipped copy of 
CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into 
producer CoGroupByKey/MakeUnionTable1
Feb 02, 2021 12:24:05 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-02T12:24:05.391Z: Fusing consumer CoGroupByKey/GBK/WriteStream 
into CoGroupByKey/MakeUnionTable0
Feb 02, 2021 12:24:05 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-02T12:24:05.430Z: Fusing consumer Read input/StripIds into Read 
input/DataflowRunner.StreamingUnboundedRead.ReadWithIds
Feb 02, 2021 12:24:05 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-02T12:24:05.467Z: Fusing consumer Read co-input/StripIds into 
Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds
Feb 02, 2021 12:24:05 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-02T12:24:05.499Z: Fusing consumer Collect start time metrics 
(co-input) into Read co-input/StripIds
Feb 02, 2021 12:24:05 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-02T12:24:05.534Z: Fusing consumer Window.Into()2/Window.Assign 
into Collect start time metrics (co-input)
Feb 02, 2021 12:24:05 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-02T12:24:05.565Z: Fusing consumer CoGroupByKey/MakeUnionTable1 
into Window.Into()2/Window.Assign
Feb 02, 2021 12:24:05 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-02T12:24:05.605Z: Fusing consumer Collect start time metrics 
(input) into Read input/StripIds
Feb 02, 2021 12:24:05 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-02T12:24:05.646Z: Fusing consumer Window.Into()/Window.Assign 
into Collect start time metrics (input)
Feb 02, 2021 12:24:05 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-02T12:24:05.693Z: Fusing consumer CoGroupByKey/MakeUnionTable0 
into Window.Into()/Window.Assign
Feb 02, 2021 12:24:05 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-02T12:24:05.734Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets 
into CoGroupByKey/GBK/ReadStream
Feb 02, 2021 12:24:05 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-02T12:24:05.776Z: Fusing consumer 
CoGroupByKey/ConstructCoGbkResultFn into CoGroupByKey/GBK/MergeBuckets
Feb 02, 2021 12:24:05 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-02T12:24:05.816Z: Fusing consumer Ungroup and reiterate into 
CoGroupByKey/ConstructCoGbkResultFn
Feb 02, 2021 12:24:05 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-02T12:24:05.860Z: Fusing consumer Collect total bytes into 
Ungroup and reiterate
Feb 02, 2021 12:24:08 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-02T12:24:05.903Z: Fusing consumer Collect end time metrics into 
Collect total bytes
Feb 02, 2021 12:24:08 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-02T12:24:06.323Z: Executing operation 
CoGroupByKey/GBK/ReadStream+CoGroupByKey/GBK/MergeBuckets+CoGroupByKey/ConstructCoGbkResultFn+Ungroup
 and reiterate+Collect total bytes+Collect end time metrics
Feb 02, 2021 12:24:08 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-02T12:24:06.356Z: Executing operation Read 
co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds+Read 
co-input/StripIds+Collect start time metrics 
(co-input)+Window.Into()2/Window.Assign+CoGroupByKey/MakeUnionTable1+CoGroupByKey/GBK/WriteStream
Feb 02, 2021 12:24:08 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-02T12:24:06.401Z: Executing operation Read 
input/DataflowRunner.StreamingUnboundedRead.ReadWithIds+Read 
input/StripIds+Collect start time metrics 
(input)+Window.Into()/Window.Assign+CoGroupByKey/MakeUnionTable0+CoGroupByKey/GBK/WriteStream
Feb 02, 2021 12:24:08 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-02T12:24:06.411Z: Starting 5 workers in us-central1-f...
Feb 02, 2021 12:24:19 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-02T12:24:18.202Z: Your project already contains 100 
Dataflow-created metric descriptors, so new user metrics of the form 
custom.googleapis.com/* will not be created. However, all user metrics are also 
available in the metric dataflow.googleapis.com/job/user_counter. If you rely 
on the custom metrics, you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Feb 02, 2021 12:24:40 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-02T12:24:40.210Z: Autoscaling: Raised the number of workers to 3 so 
that the pipeline can catch up with its backlog and keep up with its input rate.
Feb 02, 2021 12:24:40 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-02T12:24:40.290Z: Resized worker pool to 3, though goal was 5. 
This could be a quota issue.
Feb 02, 2021 12:24:53 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-02T12:24:50.796Z: Autoscaling: Raised the number of workers to 5 so 
that the pipeline can catch up with its backlog and keep up with its input rate.
Feb 02, 2021 12:25:15 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-02T12:25:13.754Z: Workers have started successfully.
Feb 02, 2021 12:25:15 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-02T12:25:13.783Z: Workers have started successfully.
Feb 02, 2021 4:00:30 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-02T16:00:30.118Z: Cancel request is committed for workflow job: 
2021-02-02_04_23_57-10322241500736390368.
Feb 02, 2021 4:00:30 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-02T16:00:30.145Z: Finished operation 
CoGroupByKey/GBK/ReadStream+CoGroupByKey/GBK/MergeBuckets+CoGroupByKey/ConstructCoGbkResultFn+Ungroup
 and reiterate+Collect total bytes+Collect end time metrics
Feb 02, 2021 4:00:30 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-02T16:00:30.145Z: Finished operation Read 
co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds+Read 
co-input/StripIds+Collect start time metrics 
(co-input)+Window.Into()2/Window.Assign+CoGroupByKey/MakeUnionTable1+CoGroupByKey/GBK/WriteStream
Feb 02, 2021 4:00:30 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-02T16:00:30.145Z: Finished operation Read 
input/DataflowRunner.StreamingUnboundedRead.ReadWithIds+Read 
input/StripIds+Collect start time metrics 
(input)+Window.Into()/Window.Assign+CoGroupByKey/MakeUnionTable0+CoGroupByKey/GBK/WriteStream
Feb 02, 2021 4:00:32 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-02T16:00:30.495Z: Cleaning up.
Feb 02, 2021 4:00:32 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-02T16:00:30.606Z: Stopping worker pool...
Feb 02, 2021 4:01:23 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-02T16:01:23.037Z: Autoscaling: Reduced the number of workers to 0 
based on low average worker CPU utilization, and the pipeline having sufficiently 
low backlog and keeping up with input rate.
Feb 02, 2021 4:01:23 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-02T16:01:23.087Z: Worker pool stopped.
Feb 02, 2021 4:01:36 PM org.apache.beam.runners.dataflow.DataflowPipelineJob 
logTerminalState
INFO: Job 2021-02-02_04_23_57-10322241500736390368 finished with status 
CANCELLED.
Load test results for test (ID): 61d06138-ccd6-4f60-9b40-6001596598d2 and 
timestamp: 2021-02-02T12:23:45.952000000Z:
                    Metric:        Value:
       dataflow_runtime_sec     12809.473
 dataflow_total_bytes_count   1.8435785E9
Exception in thread "main" java.lang.RuntimeException: Invalid job state: 
CANCELLED.
        at 
org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
        at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:137)
        at 
org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
        at 
org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/6.8/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 39m 9s
90 actionable tasks: 55 executed, 35 from cache

Publishing build scan...
https://gradle.com/s/ikul2en57n6sy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
