See <https://builds.apache.org/job/beam_LoadTests_Java_GBK_Dataflow_Streaming/319/display/redirect?page=changes>
Changes:

[relax] switch cogbk to use Beam transform
[relax] finish join
[relax] support side-input joins
[relax] support side-input joins
[relax] spotless
[relax] make FieldAccessDescriptor always be field-insertion order
[relax] fix side-input joins
[relax] fix bug
[relax] remove obsolete test
[relax] add javadoc
[relax] add unit tests
[relax] update sql transform

------------------------------------------
[...truncated 169.00 KB...]
> Task :model:job-management:generateProto UP-TO-DATE
> Task :model:fn-execution:compileJava UP-TO-DATE
> Task :model:fn-execution:classes UP-TO-DATE
> Task :model:job-management:compileJava UP-TO-DATE
> Task :model:job-management:classes UP-TO-DATE
> Task :model:job-management:shadowJar UP-TO-DATE
> Task :model:fn-execution:shadowJar UP-TO-DATE
> Task :sdks:java:core:compileJava UP-TO-DATE
> Task :sdks:java:core:classes UP-TO-DATE
> Task :sdks:java:core:shadowJar UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:compileJava UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :sdks:java:fn-execution:compileJava UP-TO-DATE
> Task :runners:local-java:compileJava UP-TO-DATE
> Task :sdks:java:extensions:protobuf:extractIncludeProto UP-TO-DATE
> Task :runners:local-java:classes UP-TO-DATE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :sdks:java:io:synthetic:compileJava UP-TO-DATE
> Task :sdks:java:io:synthetic:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :runners:local-java:jar UP-TO-DATE
> Task :runners:core-construction-java:compileJava UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:shadowJar UP-TO-DATE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :sdks:java:fn-execution:jar UP-TO-DATE
> Task :sdks:java:io:synthetic:jar UP-TO-DATE
> Task :runners:core-construction-java:jar UP-TO-DATE
> Task :sdks:java:extensions:protobuf:compileJava UP-TO-DATE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:jar UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :sdks:java:io:kinesis:compileJava UP-TO-DATE
> Task :sdks:java:io:kinesis:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar UP-TO-DATE
> Task :sdks:java:io:kinesis:jar UP-TO-DATE
> Task :sdks:java:expansion-service:compileJava UP-TO-DATE
> Task :sdks:java:expansion-service:classes UP-TO-DATE
> Task :sdks:java:expansion-service:jar UP-TO-DATE
> Task :runners:core-java:compileJava UP-TO-DATE
> Task :runners:core-java:classes UP-TO-DATE
> Task :runners:core-java:jar UP-TO-DATE
> Task :sdks:java:harness:compileJava UP-TO-DATE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:testing:test-utils:compileJava UP-TO-DATE
> Task :sdks:java:testing:test-utils:classes UP-TO-DATE
> Task :sdks:java:harness:jar UP-TO-DATE
> Task :sdks:java:io:kafka:compileJava UP-TO-DATE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:testing:test-utils:jar UP-TO-DATE
> Task :sdks:java:io:kafka:jar UP-TO-DATE
> Task :sdks:java:harness:shadowJar UP-TO-DATE
> Task :runners:java-fn-execution:compileJava UP-TO-DATE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:compileJava UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:jar UP-TO-DATE
> Task :runners:direct-java:compileJava UP-TO-DATE
> Task :runners:direct-java:classes UP-TO-DATE
> Task :runners:direct-java:shadowJar UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:compileJava UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:classes UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:jar UP-TO-DATE
> Task :sdks:java:testing:load-tests:compileJava UP-TO-DATE
> Task :sdks:java:testing:load-tests:classes UP-TO-DATE
> Task :sdks:java:testing:load-tests:jar UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:classes UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar UP-TO-DATE

> Task :sdks:java:testing:load-tests:run
Mar 23, 2020 1:06:11 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Mar 23, 2020 1:06:11 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 172 files. Enable logging at DEBUG level to see which files will be staged.
Mar 23, 2020 1:06:12 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Mar 23, 2020 1:06:12 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 173 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Mar 23, 2020 1:06:13 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 173 files cached, 0 files newly uploaded in 0 seconds
Mar 23, 2020 1:06:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Mar 23, 2020 1:06:13 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 64 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@629f066f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1542af63, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@ecfbe91, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@20ed3303, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3adbe50f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3a627c80, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@49aa766b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@963176, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65004ff6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4cafa9aa, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@562c877a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@67001148, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@989da1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@31cb96e1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3eed0f5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@64030b91, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2032e725, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4d23015c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@383f1975, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@441cc260, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@73a00e09, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@26dcd8c0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@66e889df, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@444548a0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3766c667, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@773c0293, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@55b8dbda, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3b569985, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3a022576, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2dbd803f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3e48e859, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@31ddd4a4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1a5f7e7c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5b22b970, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@22d1886d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7df60067, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1cbb3d3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@529cfee5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7ca0863b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@319854f0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@748fe51d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@415156bf, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@393881f0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4af46df3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4158debd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@af78c87, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@773dab28, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1ecfcbc9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1965539b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2fc07784, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@353efdbf, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@55cff952, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@660591fb, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4a55a6e8, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@8c46918, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@226b143b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@682bd3c4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f2e4acf, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@24097e9b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5eb97ced, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@68ba310d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@153f66e7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7aad3f7d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f667ad1]
Mar 23, 2020 1:06:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Mar 23, 2020 1:06:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics as step s3
Mar 23, 2020 1:06:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Total bytes monitor as step s4
Mar 23, 2020 1:06:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s5
Mar 23, 2020 1:06:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Group by key (0) as step s6
Mar 23, 2020 1:06:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate (0) as step s7
Mar 23, 2020 1:06:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics (0) as step s8
Mar 23, 2020 1:06:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Group by key (1) as step s9
Mar 23, 2020 1:06:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate (1) as step s10
Mar 23, 2020 1:06:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics (1) as step s11
Mar 23, 2020 1:06:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Group by key (2) as step s12
Mar 23, 2020 1:06:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate (2) as step s13
Mar 23, 2020 1:06:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics (2) as step s14
Mar 23, 2020 1:06:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Group by key (3) as step s15
Mar 23, 2020 1:06:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate (3) as step s16
Mar 23, 2020 1:06:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics (3) as step s17
Mar 23, 2020 1:06:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
Mar 23, 2020 1:06:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.21.0-SNAPSHOT
Mar 23, 2020 1:06:14 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$DefaultGcpRegionFactory create
WARNING: Region will default to us-central1. Future releases of Beam will require the user to set the region explicitly. https://cloud.google.com/compute/docs/regions-zones/regions-zones
Mar 23, 2020 1:06:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-23_06_06_14-8011108516876577625?project=apache-beam-testing
Mar 23, 2020 1:06:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2020-03-23_06_06_14-8011108516876577625
Mar 23, 2020 1:06:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-03-23_06_06_14-8011108516876577625
Mar 23, 2020 1:06:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2020-03-23T13:06:17.604Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java0dataflow000mode00gbk04-jenkins-0323130611--6tub. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Mar 23, 2020 1:06:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-03-23T13:06:17.843Z: Checking permissions granted to controller Service Account.
Mar 23, 2020 1:06:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-03-23T13:06:22.477Z: Worker configuration: n1-standard-4 in us-central1-c.
Mar 23, 2020 1:06:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-03-23T13:06:23.031Z: Expanding CoGroupByKey operations into optimizable parts.
Mar 23, 2020 1:06:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-03-23T13:06:23.050Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Mar 23, 2020 1:06:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-03-23T13:06:23.053Z: Expanding GroupByKey operations into streaming Read/Write steps
Mar 23, 2020 1:06:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-03-23T13:06:23.065Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Mar 23, 2020 1:06:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-03-23T13:06:23.110Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Mar 23, 2020 1:06:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-03-23T13:06:23.113Z: Fusing consumer Group by key (2)/WriteStream into Window.Into()/Window.Assign
Mar 23, 2020 1:06:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-03-23T13:06:23.115Z: Fusing consumer Group by key (3)/WriteStream into Window.Into()/Window.Assign
Mar 23, 2020 1:06:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-03-23T13:06:23.117Z: Fusing consumer Group by key (0)/WriteStream into Window.Into()/Window.Assign
Mar 23, 2020 1:06:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-03-23T13:06:23.119Z: Fusing consumer Group by key (1)/WriteStream into Window.Into()/Window.Assign
Mar 23, 2020 1:06:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-03-23T13:06:23.121Z: Fusing consumer Read input/StripIds into Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds
Mar 23, 2020 1:06:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-03-23T13:06:23.123Z: Fusing consumer Collect start time metrics into Read input/StripIds
Mar 23, 2020 1:06:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-03-23T13:06:23.125Z: Fusing consumer Total bytes monitor into Collect start time metrics
Mar 23, 2020 1:06:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-03-23T13:06:23.128Z: Fusing consumer Window.Into()/Window.Assign into Total bytes monitor
Mar 23, 2020 1:06:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-03-23T13:06:23.130Z: Fusing consumer Group by key (1)/MergeBuckets into Group by key (1)/ReadStream
Mar 23, 2020 1:06:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-03-23T13:06:23.132Z: Fusing consumer Ungroup and reiterate (1) into Group by key (1)/MergeBuckets
Mar 23, 2020 1:06:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-03-23T13:06:23.134Z: Fusing consumer Collect end time metrics (1) into Ungroup and reiterate (1)
Mar 23, 2020 1:06:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-03-23T13:06:23.136Z: Fusing consumer Group by key (2)/MergeBuckets into Group by key (2)/ReadStream
Mar 23, 2020 1:06:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-03-23T13:06:23.138Z: Fusing consumer Ungroup and reiterate (2) into Group by key (2)/MergeBuckets
Mar 23, 2020 1:06:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-03-23T13:06:23.140Z: Fusing consumer Collect end time metrics (2) into Ungroup and reiterate (2)
Mar 23, 2020 1:06:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-03-23T13:06:23.142Z: Fusing consumer Group by key (3)/MergeBuckets into Group by key (3)/ReadStream
Mar 23, 2020 1:06:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-03-23T13:06:23.145Z: Fusing consumer Ungroup and reiterate (3) into Group by key (3)/MergeBuckets
Mar 23, 2020 1:06:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-03-23T13:06:23.147Z: Fusing consumer Collect end time metrics (3) into Ungroup and reiterate (3)
Mar 23, 2020 1:06:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-03-23T13:06:23.149Z: Fusing consumer Group by key (0)/MergeBuckets into Group by key (0)/ReadStream
Mar 23, 2020 1:06:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-03-23T13:06:23.151Z: Fusing consumer Ungroup and reiterate (0) into Group by key (0)/MergeBuckets
Mar 23, 2020 1:06:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-03-23T13:06:23.154Z: Fusing consumer Collect end time metrics (0) into Ungroup and reiterate (0)
Mar 23, 2020 1:06:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-03-23T13:06:23.352Z: Starting 16 workers...
Mar 23, 2020 1:06:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-03-23T13:06:26.015Z: Executing operation Group by key (2)/ReadStream+Group by key (2)/MergeBuckets+Ungroup and reiterate (2)+Collect end time metrics (2)
Mar 23, 2020 1:06:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-03-23T13:06:26.017Z: Executing operation Group by key (0)/ReadStream+Group by key (0)/MergeBuckets+Ungroup and reiterate (0)+Collect end time metrics (0)
Mar 23, 2020 1:06:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-03-23T13:06:26.018Z: Executing operation Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds+Read input/StripIds+Collect start time metrics+Total bytes monitor+Window.Into()/Window.Assign+Group by key (2)/WriteStream+Group by key (3)/WriteStream+Group by key (0)/WriteStream+Group by key (1)/WriteStream
Mar 23, 2020 1:06:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-03-23T13:06:26.020Z: Executing operation Group by key (3)/ReadStream+Group by key (3)/MergeBuckets+Ungroup and reiterate (3)+Collect end time metrics (3)
Mar 23, 2020 1:06:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-03-23T13:06:26.021Z: Executing operation Group by key (1)/ReadStream+Group by key (1)/MergeBuckets+Ungroup and reiterate (1)+Collect end time metrics (1)
Mar 23, 2020 1:06:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2020-03-23T13:06:32.425Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Mar 23, 2020 1:06:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-03-23T13:06:54.141Z: Checking permissions granted to controller Service Account.
Mar 23, 2020 1:06:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-03-23T13:06:58.483Z: Worker configuration: n1-standard-4 in us-central1-c.
Mar 23, 2020 1:07:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-03-23T13:07:07.329Z: Workers have started successfully.

> Task :sdks:java:testing:load-tests:run FAILED
The message received from the daemon indicates that the daemon has disappeared.
Build request sent: Build{id=e91fe4ff-de20-4abd-b355-5ece6bd2246f, currentDir=<https://builds.apache.org/job/beam_LoadTests_Java_GBK_Dataflow_Streaming/ws/src>}
Attempting to read last messages from the daemon log...
Daemon pid: 9601
log file: /home/jenkins/.gradle/daemon/5.2.1/daemon-9601.out.log
----- Last 20 lines from daemon log file - daemon-9601.out.log -----
INFO: 2020-03-23T13:06:23.352Z: Starting 16 workers...
Mar 23, 2020 1:06:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-03-23T13:06:26.015Z: Executing operation Group by key (2)/ReadStream+Group by key (2)/MergeBuckets+Ungroup and reiterate (2)+Collect end time metrics (2)
Mar 23, 2020 1:06:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-03-23T13:06:26.017Z: Executing operation Group by key (0)/ReadStream+Group by key (0)/MergeBuckets+Ungroup and reiterate (0)+Collect end time metrics (0)
Mar 23, 2020 1:06:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-03-23T13:06:26.018Z: Executing operation Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds+Read input/StripIds+Collect start time metrics+Total bytes monitor+Window.Into()/Window.Assign+Group by key (2)/WriteStream+Group by key (3)/WriteStream+Group by key (0)/WriteStream+Group by key (1)/WriteStream
Mar 23, 2020 1:06:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-03-23T13:06:26.020Z: Executing operation Group by key (3)/ReadStream+Group by key (3)/MergeBuckets+Ungroup and reiterate (3)+Collect end time metrics (3)
Mar 23, 2020 1:06:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-03-23T13:06:26.021Z: Executing operation Group by key (1)/ReadStream+Group by key (1)/MergeBuckets+Ungroup and reiterate (1)+Collect end time metrics (1)
Mar 23, 2020 1:06:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2020-03-23T13:06:32.425Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Mar 23, 2020 1:06:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-03-23T13:06:54.141Z: Checking permissions granted to controller Service Account.
Mar 23, 2020 1:06:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-03-23T13:06:58.483Z: Worker configuration: n1-standard-4 in us-central1-c.
Mar 23, 2020 1:07:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-03-23T13:07:07.329Z: Workers have started successfully.
Daemon vm is shutting down...
The daemon has exited normally or was terminated in response to a user interrupt.
----- End of the daemon log -----

FAILURE: Build failed with an exception.

* What went wrong:
Gradle build daemon disappeared unexpectedly (it may have been killed or may have crashed)

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
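Since the build harness died while the streaming job was still running on Dataflow, the job likely kept running (and billing) after this failure. As a hedged follow-up sketch, not part of the original log: the snippet below assembles the cancel command the runner printed (project, region, and job ID copied verbatim from the log above) and the Gradle re-run with the diagnostic flags the failure summary suggests. It only echoes the commands so they can be reviewed before running.

```shell
#!/bin/sh
# Values taken from the log above.
PROJECT=apache-beam-testing
REGION=us-central1
JOB_ID=2020-03-23_06_06_14-8011108516876577625

# Cancel the orphaned streaming job so it stops accruing Compute Engine cost.
CANCEL_CMD="gcloud dataflow jobs cancel $JOB_ID --project=$PROJECT --region=$REGION"
echo "$CANCEL_CMD"

# Re-run the failed task with the diagnostics Gradle recommends after a daemon crash.
RETRY_CMD="./gradlew :sdks:java:testing:load-tests:run --stacktrace --info"
echo "$RETRY_CMD"
```

Echoing rather than executing keeps the sketch safe to run anywhere; drop the `echo`s (or pipe into `sh`) once the values are confirmed.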
