See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_Streaming/538/display/redirect?page=changes>

Changes:

[sonam.ramchand] Enabled strict dependency on Local Java

[heejong] [BEAM-11032] Use metric for Java BigQuery streaming insert API latency

[heejong] fix checkstyle error, rename equalsMetricName

[noreply] Fix broken link to signature for 2.21.0

[sonam.ramchand] set  enableStrictDependencies: true

[noreply] [BEAM-9615] Add Schema Logical Type Provider support (#13760)

[noreply] Remove redundant & detached package comment

[dhuntsperger] BEAM-10095: Add Runner and SDK links to Beam overview page

[noreply] [BEAM-9615] Disable schema registration.

[noreply] [BEAM-10961] Enabled strict dependency on Spark (#13668)

[noreply] [BEAM-11461] update tox and simplify tox.ini (#13692)


------------------------------------------
[...truncated 201.94 KB...]
> Task :model:pipeline:processResources UP-TO-DATE
> Task :model:pipeline:classes UP-TO-DATE
> Task :model:pipeline:jar UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:****:windmill:shadowJar UP-TO-DATE
> Task :model:pipeline:shadowJar UP-TO-DATE
> Task :model:job-management:extractIncludeProto UP-TO-DATE
> Task :model:job-management:generateProto UP-TO-DATE
> Task :model:fn-execution:extractIncludeProto UP-TO-DATE
> Task :model:fn-execution:generateProto UP-TO-DATE
> Task :model:fn-execution:compileJava UP-TO-DATE
> Task :model:fn-execution:classes UP-TO-DATE
> Task :model:job-management:compileJava UP-TO-DATE
> Task :model:job-management:classes UP-TO-DATE
> Task :model:fn-execution:shadowJar UP-TO-DATE
> Task :model:job-management:shadowJar UP-TO-DATE
> Task :sdks:java:core:compileJava UP-TO-DATE
> Task :sdks:java:core:classes UP-TO-DATE
> Task :sdks:java:core:shadowJar UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:compileJava UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:shadowJar UP-TO-DATE
> Task :sdks:java:extensions:protobuf:extractIncludeProto UP-TO-DATE
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :sdks:java:extensions:protobuf:compileJava UP-TO-DATE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:jar UP-TO-DATE
> Task :sdks:java:io:synthetic:compileJava UP-TO-DATE
> Task :sdks:java:io:synthetic:classes UP-TO-DATE
> Task :sdks:java:io:synthetic:jar UP-TO-DATE
> Task :sdks:java:io:kinesis:compileJava UP-TO-DATE
> Task :sdks:java:io:kinesis:classes UP-TO-DATE
> Task :sdks:java:fn-execution:compileJava UP-TO-DATE
> Task :runners:local-java:compileJava UP-TO-DATE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :runners:local-java:classes UP-TO-DATE
> Task :sdks:java:io:kinesis:jar UP-TO-DATE
> Task :runners:local-java:jar UP-TO-DATE
> Task :sdks:java:fn-execution:jar UP-TO-DATE
> Task :runners:core-construction-java:compileJava UP-TO-DATE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :runners:core-construction-java:jar UP-TO-DATE
> Task :runners:core-java:compileJava UP-TO-DATE
> Task :runners:core-java:classes UP-TO-DATE
> Task :runners:core-java:jar UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar UP-TO-DATE
> Task :sdks:java:harness:compileJava UP-TO-DATE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar UP-TO-DATE
> Task :sdks:java:harness:shadowJar UP-TO-DATE
> Task :sdks:java:testing:test-utils:compileJava UP-TO-DATE
> Task :sdks:java:testing:test-utils:classes UP-TO-DATE
> Task :sdks:java:testing:test-utils:jar UP-TO-DATE
> Task :runners:java-fn-execution:compileJava UP-TO-DATE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar UP-TO-DATE
> Task :sdks:java:expansion-service:compileJava UP-TO-DATE
> Task :sdks:java:expansion-service:classes UP-TO-DATE
> Task :runners:direct-java:compileJava UP-TO-DATE
> Task :runners:direct-java:classes UP-TO-DATE
> Task :sdks:java:expansion-service:jar UP-TO-DATE
> Task :runners:direct-java:shadowJar UP-TO-DATE
> Task :sdks:java:io:kafka:compileJava UP-TO-DATE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:io:kafka:jar UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:compileJava UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:jar UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:compileJava UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:classes UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:jar UP-TO-DATE
> Task :sdks:java:testing:load-tests:compileJava UP-TO-DATE
> Task :sdks:java:testing:load-tests:classes UP-TO-DATE
> Task :sdks:java:testing:load-tests:jar UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:****:legacy-****:compileJava UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:****:legacy-****:classes UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:****:legacy-****:shadowJar UP-TO-DATE

> Task :sdks:java:testing:load-tests:run
Jan 20, 2021 1:06:24 PM 
org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory
 create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Jan 20, 2021 1:06:25 PM org.apache.beam.runners.dataflow.DataflowRunner 
fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from 
the classpath: will stage 189 files. Enable logging at DEBUG level to see which 
files will be staged.
Jan 20, 2021 1:06:26 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Jan 20, 2021 1:06:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing 
implications related to Google Compute Engine usage and other Google Cloud 
Services.
Jan 20, 2021 1:06:30 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
INFO: Uploading 190 files from PipelineOptions.filesToStage to staging location 
to prepare for execution.
Jan 20, 2021 1:06:30 PM 
org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes 
forFileToStage
INFO: Staging custom dataflow-****.jar as 
beam-runners-google-cloud-dataflow-java-legacy-****-2.28.0-SNAPSHOT-Nr_hlylUEO8TyRrBomNEozbSGC2dSRiCH95xyEXjVNA.jar
Jan 20, 2021 1:06:31 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
INFO: Staging files complete: 190 files cached, 0 files newly uploaded in 0 
seconds
Jan 20, 2021 1:06:31 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as 
step s1
Jan 20, 2021 1:06:31 PM 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: 
[org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5292ceca, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@13d9261f, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@e9ef5b6, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5300cac, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4110765e, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1ba359bd, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@62e93c3a, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@673919a7, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@25d93198, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2436ea2f, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f951a7f, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@20cece0b, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4c777e7b, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5f038248, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@78e22d35, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2e8a1ab4, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@59f93db8, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1aabf50d, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@73c9e8e8, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@de8039f]
Jan 20, 2021 1:06:31 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Jan 20, 2021 1:06:31 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Jan 20, 2021 1:06:31 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Jan 20, 2021 1:06:31 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as 
step s5
Jan 20, 2021 1:06:31 PM 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: 
[org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@437486cd, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@15b642b9, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@518bfd90, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@317a118b, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@56dfab87, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@715b886f, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7fb29ca9, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1e253c9d, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@350d3f4d, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@18b8d173, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@73844119, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@44f24a20, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1859e2a4, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46349b95, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@176996c3, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@411c6d44, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1687eb01, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@748d2277, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2f897dab, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@d5d5353]
Jan 20, 2021 1:06:31 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Jan 20, 2021 1:06:31 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Jan 20, 2021 1:06:31 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Jan 20, 2021 1:06:31 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Jan 20, 2021 1:06:31 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Jan 20, 2021 1:06:31 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Jan 20, 2021 1:06:31 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Jan 20, 2021 1:06:31 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Jan 20, 2021 1:06:31 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Jan 20, 2021 1:06:31 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Jan 20, 2021 1:06:31 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Jan 20, 2021 1:06:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging pipeline description to 
gs://temp-storage-for-perf-tests/loadtests/staging/
Jan 20, 2021 1:06:31 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading <95190 bytes, hash 
c688db8db26914bbfab60edbb051ca828c6ee4c7ddc7d367293271b25d9f8553> to 
gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-xojbjbJpFLv6tg7bsFHKgoxu5Mfdx9NnKTJxsl2fhVM.pb
Jan 20, 2021 1:06:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.28.0-SNAPSHOT
Jan 20, 2021 1:06:33 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-20_05_06_31-15181533917531264609?project=apache-beam-testing
Jan 20, 2021 1:06:33 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-01-20_05_06_31-15181533917531264609
Jan 20, 2021 1:06:33 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-01-20_05_06_31-15181533917531264609
Jan 20, 2021 1:06:38 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-01-20T13:06:35.937Z: The workflow name is not a valid Cloud 
Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring 
will be labeled with this modified job name: 
load0tests0java0dataflow0streaming0cogbk04-jenkins-0120130-e9jq. For the best 
monitoring experience, please name your job with a valid Cloud Label. For 
details, see: 
https://cloud.google.com/compute/docs/labeling-resources#restrictions
Jan 20, 2021 1:06:39 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-01-20T13:06:39.161Z: Worker configuration: n1-standard-4 in 
us-central1-f.
Jan 20, 2021 1:06:41 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-01-20T13:06:39.759Z: Expanding CoGroupByKey operations into 
optimizable parts.
Jan 20, 2021 1:06:41 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-01-20T13:06:39.770Z: Expanding SplittableProcessKeyed operations 
into optimizable parts.
Jan 20, 2021 1:06:41 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-01-20T13:06:39.772Z: Expanding GroupByKey operations into streaming 
Read/Write steps
Jan 20, 2021 1:06:41 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-01-20T13:06:39.776Z: Lifting ValueCombiningMappingFns into 
MergeBucketsMappingFns
Jan 20, 2021 1:06:41 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-01-20T13:06:39.793Z: Fusing adjacent ParDo, Read, Write, and Flatten 
operations
Jan 20, 2021 1:06:41 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-01-20T13:06:39.796Z: Unzipping flatten s11 for input 
s10.org.apache.beam.sdk.values.PCollection.<init>:402#5f2ef1f005ae0b4
Jan 20, 2021 1:06:41 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-01-20T13:06:39.798Z: Fusing unzipped copy of 
CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into 
producer CoGroupByKey/MakeUnionTable1
Jan 20, 2021 1:06:41 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-01-20T13:06:39.800Z: Fusing consumer CoGroupByKey/GBK/WriteStream 
into CoGroupByKey/MakeUnionTable0
Jan 20, 2021 1:06:41 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-01-20T13:06:39.802Z: Fusing consumer Read input/StripIds into Read 
input/DataflowRunner.StreamingUnboundedRead.ReadWithIds
Jan 20, 2021 1:06:41 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-01-20T13:06:39.804Z: Fusing consumer Read co-input/StripIds into 
Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds
Jan 20, 2021 1:06:41 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-01-20T13:06:39.807Z: Fusing consumer Collect start time metrics 
(co-input) into Read co-input/StripIds
Jan 20, 2021 1:06:41 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-01-20T13:06:39.809Z: Fusing consumer Window.Into()2/Window.Assign 
into Collect start time metrics (co-input)
Jan 20, 2021 1:06:41 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-01-20T13:06:39.811Z: Fusing consumer CoGroupByKey/MakeUnionTable1 
into Window.Into()2/Window.Assign
Jan 20, 2021 1:06:41 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-01-20T13:06:39.812Z: Fusing consumer Collect start time metrics 
(input) into Read input/StripIds
Jan 20, 2021 1:06:41 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-01-20T13:06:39.814Z: Fusing consumer Window.Into()/Window.Assign 
into Collect start time metrics (input)
Jan 20, 2021 1:06:41 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-01-20T13:06:39.816Z: Fusing consumer CoGroupByKey/MakeUnionTable0 
into Window.Into()/Window.Assign
Jan 20, 2021 1:06:41 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-01-20T13:06:39.819Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets 
into CoGroupByKey/GBK/ReadStream
Jan 20, 2021 1:06:41 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-01-20T13:06:39.821Z: Fusing consumer 
CoGroupByKey/ConstructCoGbkResultFn into CoGroupByKey/GBK/MergeBuckets
Jan 20, 2021 1:06:41 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-01-20T13:06:39.823Z: Fusing consumer Ungroup and reiterate into 
CoGroupByKey/ConstructCoGbkResultFn
Jan 20, 2021 1:06:41 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-01-20T13:06:39.826Z: Fusing consumer Collect total bytes into 
Ungroup and reiterate
Jan 20, 2021 1:06:41 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-01-20T13:06:39.828Z: Fusing consumer Collect end time metrics into 
Collect total bytes
Jan 20, 2021 1:06:43 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-01-20T13:06:40.051Z: Starting 5 ****s...
Jan 20, 2021 1:06:43 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-01-20T13:06:42.428Z: Executing operation Read 
input/DataflowRunner.StreamingUnboundedRead.ReadWithIds+Read 
input/StripIds+Collect start time metrics 
(input)+Window.Into()/Window.Assign+CoGroupByKey/MakeUnionTable0+CoGroupByKey/GBK/WriteStream
Jan 20, 2021 1:06:43 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-01-20T13:06:42.428Z: Executing operation Read 
co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds+Read 
co-input/StripIds+Collect start time metrics 
(co-input)+Window.Into()2/Window.Assign+CoGroupByKey/MakeUnionTable1+CoGroupByKey/GBK/WriteStream
Jan 20, 2021 1:06:43 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-01-20T13:06:42.428Z: Executing operation 
CoGroupByKey/GBK/ReadStream+CoGroupByKey/GBK/MergeBuckets+CoGroupByKey/ConstructCoGbkResultFn+Ungroup
 and reiterate+Collect total bytes+Collect end time metrics
Jan 20, 2021 1:07:10 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-01-20T13:07:09.129Z: Your project already contains 100 
Dataflow-created metric descriptors, so new user metrics of the form 
custom.googleapis.com/* will not be created. However, all user metrics are also 
available in the metric dataflow.googleapis.com/job/user_counter. If you rely 
on the custom metrics, you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Jan 20, 2021 1:07:21 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-01-20T13:07:19.259Z: Worker configuration: n1-standard-4 in 
us-central1-f.
Jan 20, 2021 1:07:32 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-01-20T13:07:31.785Z: Workers have started successfully.
Jan 20, 2021 1:09:25 PM org.apache.beam.runners.dataflow.DataflowPipelineJob 
lambda$waitUntilFinish$0
WARNING: Job is already running in Google Cloud Platform, Ctrl-C will not 
cancel it.
To cancel the job in the cloud, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-01-20_05_06_31-15181533917531264609

> Task :sdks:java:testing:load-tests:run FAILED
The message received from the daemon indicates that the daemon has disappeared.
Build request sent: Build{id=524ab69c-20c2-4d36-a435-bafb0a792499, 
currentDir=<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_Streaming/ws/src>}
Attempting to read last messages from the daemon log...
Daemon pid: 30554
  log file: /home/jenkins/.gradle/daemon/6.8/daemon-30554.out.log
----- Last 20 lines from daemon log file - daemon-30554.out.log -----
INFO: 2021-01-20T13:06:39.828Z: Fusing consumer Collect end time metrics into 
Collect total bytes
Jan 20, 2021 1:06:43 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-01-20T13:06:40.051Z: Starting 5 ****s...
Jan 20, 2021 1:06:43 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-01-20T13:06:42.428Z: Executing operation Read 
input/DataflowRunner.StreamingUnboundedRead.ReadWithIds+Read 
input/StripIds+Collect start time metrics 
(input)+Window.Into()/Window.Assign+CoGroupByKey/MakeUnionTable0+CoGroupByKey/GBK/WriteStream
Jan 20, 2021 1:06:43 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-01-20T13:06:42.428Z: Executing operation Read 
co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds+Read 
co-input/StripIds+Collect start time metrics 
(co-input)+Window.Into()2/Window.Assign+CoGroupByKey/MakeUnionTable1+CoGroupByKey/GBK/WriteStream
Jan 20, 2021 1:06:43 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-01-20T13:06:42.428Z: Executing operation 
CoGroupByKey/GBK/ReadStream+CoGroupByKey/GBK/MergeBuckets+CoGroupByKey/ConstructCoGbkResultFn+Ungroup
 and reiterate+Collect total bytes+Collect end time metrics
Jan 20, 2021 1:07:10 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-01-20T13:07:09.129Z: Your project already contains 100 
Dataflow-created metric descriptors, so new user metrics of the form 
custom.googleapis.com/* will not be created. However, all user metrics are also 
available in the metric dataflow.googleapis.com/job/user_counter. If you rely 
on the custom metrics, you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Jan 20, 2021 1:07:21 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-01-20T13:07:19.259Z: Worker configuration: n1-standard-4 in 
us-central1-f.
Jan 20, 2021 1:07:32 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-01-20T13:07:31.785Z: Workers have started successfully.
Jan 20, 2021 1:09:25 PM org.apache.beam.runners.dataflow.DataflowPipelineJob 
lambda$waitUntilFinish$0
WARNING: Job is already running in Google Cloud Platform, Ctrl-C will not 
cancel it.
To cancel the job in the cloud, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-01-20_05_06_31-15181533917531264609
Daemon vm is shutting down... The daemon has exited normally or was terminated 
in response to a user interrupt.
----- End of the daemon log -----


FAILURE: Build failed with an exception.

* What went wrong:
Gradle build daemon disappeared unexpectedly (it may have been killed or may 
have crashed)

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
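For anyone reproducing this from a Beam checkout, re-running the failing task with the extra diagnostics Gradle suggests could look like the following (a sketch only; it assumes the standard ./gradlew wrapper at the repository root and that the load test's usual pipeline options are supplied the same way the Jenkins job does):

  ./gradlew :sdks:java:testing:load-tests:run --info --stacktrace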
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

