See <https://builds.apache.org/job/beam_LoadTests_Java_GBK_Dataflow_Streaming/351/display/redirect?page=changes>
Changes:

[robertwb] [BEAM-9577] New artifact staging and retrieval service for Java.
[robertwb] [BEAM-9577] Multi-threaded artifact staging service backend.
[mxm] [BEAM-9733] Always let ImpulseSourceFunction emit a final watermark
[mxm] [BEAM-9794] Reduce state cells needed for BufferingDoFnRunner
[apilloud] [BEAM-9514] Ensure nullability passes through sum
[lcwik] [BEAM-2939] Expose HasProgress interface for restriction trackers and
[lcwik] [BEAM-2939] Add the ability for SDK harness runners to provide
[lcwik] [BEAM-2939] Integrate progress reporting for splittable dofns.
[lcwik] [BEAM-2939] Fold Sizes sub-interfaces into RestrictionTracker
[lcwik] [BEAM-2939] Drop HasSize in favor of using
[mxm] [BEAM-9733] Make up for timers set while processing the bundle
[robertwb] Use futures, better error handling.
[github] [BEAM-9701] Increments fastavro version range upper bound to 0.24.
[pabloem] [BEAM-9812] Fixing bug causing pipelines requiring temp tables to not
[ehudm] [BEAM-7405] Workaround for bad Docker config
[boyuanz] [BEAM-8871] Support trySplit for ByteKeyRangeTracker
[tweise] [BEAM-9811] Nightly snapshot publish error
[github] [BEAM-9775] Adding Go SDF example, adjusting GetProgress signature.

------------------------------------------
[...truncated 66.93 KB...]
> Task :runners:google-cloud-dataflow-java:processResources UP-TO-DATE
> Task :sdks:java:io:kafka:processResources NO-SOURCE
> Task :sdks:java:io:synthetic:processResources NO-SOURCE
> Task :sdks:java:testing:test-utils:processResources NO-SOURCE
> Task :sdks:java:testing:load-tests:processResources NO-SOURCE
> Task :model:fn-execution:processResources UP-TO-DATE
> Task :sdks:java:core:generateGrammarSource UP-TO-DATE
> Task :sdks:java:core:processResources UP-TO-DATE
> Task :model:pipeline:extractIncludeProto UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:windmill:extractIncludeProto UP-TO-DATE
> Task :model:pipeline:extractProto UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:windmill:extractProto UP-TO-DATE
> Task :model:pipeline:generateProto UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:windmill:generateProto UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:windmill:compileJava UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:windmill:processResources UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:windmill:classes UP-TO-DATE
> Task :model:pipeline:compileJava UP-TO-DATE
> Task :model:pipeline:processResources UP-TO-DATE
> Task :model:pipeline:classes UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:windmill:shadowJar UP-TO-DATE
> Task :model:pipeline:jar UP-TO-DATE
> Task :model:pipeline:shadowJar UP-TO-DATE
> Task :model:fn-execution:extractIncludeProto UP-TO-DATE
> Task :model:job-management:extractIncludeProto UP-TO-DATE
> Task :model:job-management:generateProto UP-TO-DATE
> Task :model:fn-execution:generateProto UP-TO-DATE
> Task :model:job-management:compileJava UP-TO-DATE
> Task :model:job-management:classes UP-TO-DATE
> Task :model:job-management:shadowJar UP-TO-DATE
> Task :model:fn-execution:compileJava UP-TO-DATE
> Task :model:fn-execution:classes UP-TO-DATE
> Task :model:fn-execution:shadowJar UP-TO-DATE
> Task :sdks:java:core:compileJava UP-TO-DATE
> Task :sdks:java:core:classes UP-TO-DATE
> Task :sdks:java:core:shadowJar UP-TO-DATE
> Task :sdks:java:extensions:protobuf:extractIncludeProto UP-TO-DATE
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :runners:local-java:compileJava UP-TO-DATE
> Task :runners:local-java:classes UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:compileJava UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :sdks:java:io:synthetic:compileJava UP-TO-DATE
> Task :sdks:java:io:synthetic:classes UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:shadowJar UP-TO-DATE
> Task :sdks:java:io:synthetic:jar UP-TO-DATE
> Task :sdks:java:fn-execution:compileJava UP-TO-DATE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :sdks:java:fn-execution:jar UP-TO-DATE
> Task :runners:local-java:jar UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:compileJava UP-TO-DATE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar UP-TO-DATE
> Task :sdks:java:extensions:protobuf:jar UP-TO-DATE
> Task :sdks:java:testing:test-utils:compileJava UP-TO-DATE
> Task :sdks:java:testing:test-utils:classes UP-TO-DATE
> Task :sdks:java:testing:test-utils:jar UP-TO-DATE
> Task :sdks:java:io:kinesis:compileJava UP-TO-DATE
> Task :sdks:java:io:kinesis:classes UP-TO-DATE
> Task :sdks:java:io:kinesis:jar UP-TO-DATE
> Task :runners:core-construction-java:compileJava UP-TO-DATE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :runners:core-construction-java:jar UP-TO-DATE
> Task :sdks:java:expansion-service:compileJava UP-TO-DATE
> Task :sdks:java:expansion-service:classes UP-TO-DATE
> Task :sdks:java:expansion-service:jar UP-TO-DATE
> Task :sdks:java:io:kafka:compileJava UP-TO-DATE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:io:kafka:jar UP-TO-DATE
> Task :runners:core-java:compileJava UP-TO-DATE
> Task :runners:core-java:classes UP-TO-DATE
> Task :runners:core-java:jar UP-TO-DATE
> Task :sdks:java:harness:compileJava UP-TO-DATE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar UP-TO-DATE
> Task :sdks:java:harness:shadowJar UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:compileJava UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:jar UP-TO-DATE
> Task :runners:java-fn-execution:compileJava UP-TO-DATE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:compileJava UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:classes UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:jar UP-TO-DATE
> Task :runners:direct-java:compileJava UP-TO-DATE
> Task :runners:direct-java:classes UP-TO-DATE
> Task :runners:direct-java:shadowJar UP-TO-DATE
> Task :sdks:java:testing:load-tests:compileJava UP-TO-DATE
> Task :sdks:java:testing:load-tests:classes UP-TO-DATE
> Task :sdks:java:testing:load-tests:jar UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:classes UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar UP-TO-DATE

> Task :sdks:java:testing:load-tests:run
Apr 24, 2020 12:35:21 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Apr 24, 2020 12:35:21 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 173 files. Enable logging at DEBUG level to see which files will be staged.
Apr 24, 2020 12:35:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Apr 24, 2020 12:35:22 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 174 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Apr 24, 2020 12:35:23 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 174 files cached, 0 files newly uploaded in 0 seconds
Apr 24, 2020 12:35:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Apr 24, 2020 12:35:23 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@65004ff6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4cafa9aa, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@562c877a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@67001148, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@989da1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@31cb96e1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3eed0f5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@64030b91, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2032e725, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4d23015c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@383f1975, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@441cc260, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@73a00e09, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@26dcd8c0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@66e889df, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@444548a0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3766c667, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@773c0293, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@55b8dbda, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3b569985]
Apr 24, 2020 12:35:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Apr 24, 2020 12:35:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics as step s3
Apr 24, 2020 12:35:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Total bytes monitor as step s4
Apr 24, 2020 12:35:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s5
Apr 24, 2020 12:35:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Group by key (0) as step s6
Apr 24, 2020 12:35:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate (0) as step s7
Apr 24, 2020 12:35:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics (0) as step s8
Apr 24, 2020 12:35:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
Apr 24, 2020 12:35:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.22.0-SNAPSHOT
Apr 24, 2020 12:35:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-24_05_35_24-10737890374533963592?project=apache-beam-testing
Apr 24, 2020 12:35:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2020-04-24_05_35_24-10737890374533963592
Apr 24, 2020 12:35:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-04-24_05_35_24-10737890374533963592
Apr 24, 2020 12:35:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-04-24T12:35:28.101Z: Checking permissions granted to controller Service Account.
Apr 24, 2020 12:35:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2020-04-24T12:35:28.164Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java0dataflow0streaming0gbk02-jenkins-042412352-1hoj. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Apr 24, 2020 12:35:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-04-24T12:35:31.090Z: Worker configuration: n1-standard-4 in us-central1-f.
Apr 24, 2020 12:35:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-04-24T12:35:31.760Z: Expanding CoGroupByKey operations into optimizable parts.
Apr 24, 2020 12:35:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-04-24T12:35:31.780Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Apr 24, 2020 12:35:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-04-24T12:35:31.783Z: Expanding GroupByKey operations into streaming Read/Write steps
Apr 24, 2020 12:35:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-04-24T12:35:31.791Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Apr 24, 2020 12:35:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-04-24T12:35:31.994Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Apr 24, 2020 12:35:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-04-24T12:35:31.997Z: Fusing consumer Read input/StripIds into Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds
Apr 24, 2020 12:35:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-04-24T12:35:32.000Z: Fusing consumer Collect start time metrics into Read input/StripIds
Apr 24, 2020 12:35:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-04-24T12:35:32.004Z: Fusing consumer Total bytes monitor into Collect start time metrics
Apr 24, 2020 12:35:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-04-24T12:35:32.007Z: Fusing consumer Window.Into()/Window.Assign into Total bytes monitor
Apr 24, 2020 12:35:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-04-24T12:35:32.010Z: Fusing consumer Group by key (0)/WriteStream into Window.Into()/Window.Assign
Apr 24, 2020 12:35:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-04-24T12:35:32.013Z: Fusing consumer Group by key (0)/MergeBuckets into Group by key (0)/ReadStream
Apr 24, 2020 12:35:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-04-24T12:35:32.016Z: Fusing consumer Ungroup and reiterate (0) into Group by key (0)/MergeBuckets
Apr 24, 2020 12:35:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-04-24T12:35:32.019Z: Fusing consumer Collect end time metrics (0) into Ungroup and reiterate (0)
Apr 24, 2020 12:35:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-04-24T12:35:32.280Z: Starting 5 workers...
Apr 24, 2020 12:35:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-04-24T12:35:34.867Z: Executing operation Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds+Read input/StripIds+Collect start time metrics+Total bytes monitor+Window.Into()/Window.Assign+Group by key (0)/WriteStream
Apr 24, 2020 12:35:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-04-24T12:35:34.870Z: Executing operation Group by key (0)/ReadStream+Group by key (0)/MergeBuckets+Ungroup and reiterate (0)+Collect end time metrics (0)
Apr 24, 2020 12:36:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2020-04-24T12:36:01.157Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Apr 24, 2020 12:36:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-04-24T12:36:11.270Z: Checking permissions granted to controller Service Account.
Apr 24, 2020 12:36:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-04-24T12:36:14.344Z: Worker configuration: n1-standard-4 in us-central1-f.
Apr 24, 2020 12:36:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-04-24T12:36:23.628Z: Workers have started successfully.
Apr 24, 2020 12:42:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-04-24T12:42:11.269Z: Checking permissions granted to controller Service Account.
Apr 24, 2020 12:48:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-04-24T12:48:11.268Z: Checking permissions granted to controller Service Account.
Apr 24, 2020 12:54:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-04-24T12:54:11.268Z: Checking permissions granted to controller Service Account.

> Task :sdks:java:testing:load-tests:run FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 143

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 19m 47s
72 actionable tasks: 1 executed, 71 up-to-date

The message received from the daemon indicates that the daemon has disappeared.
Build request sent: Build{id=c63e2163-3419-4e91-804c-13ae917ad332, currentDir=<https://builds.apache.org/job/beam_LoadTests_Java_GBK_Dataflow_Streaming/ws/src}>
Attempting to read last messages from the daemon log...
Daemon pid: 12919
  log file: /home/jenkins/.gradle/daemon/5.2.1/daemon-12919.out.log
----- Last 20 lines from daemon log file - daemon-12919.out.log -----
INFO: 2020-04-24T12:54:11.268Z: Checking permissions granted to controller Service Account.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 143

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 19m 47s
72 actionable tasks: 1 executed, 71 up-to-date
Daemon vm is shutting down...
The daemon has exited normally or was terminated in response to a user interrupt.
----- End of the daemon log -----

FAILURE: Build failed with an exception.

* What went wrong:
Gradle build daemon disappeared unexpectedly (it may have been killed or may have crashed)

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
