See <https://ci-beam.apache.org/job/beam_LoadTests_Java_GBK_Dataflow_Streaming/638/display/redirect?page=changes>
Changes:

[Boyuan Zhang] [BEAM-11325] Support KafkaIO dynamic read
[Kyle Weaver] [BEAM-10925] Enable user-defined Java scalar functions in ZetaSQL.
[sychen] Fix the check on maxBufferingDuration
[Kyle Weaver] address review comments
[noreply] Remove an unused reference to staleTimerSet and reword the commentary.
[noreply] [BEAM-11715] Partial revert of "Combiner packing in Dataflow" (#13763)

------------------------------------------
[...truncated 66.46 KB...]
> Task :model:job-management:processResources UP-TO-DATE
> Task :sdks:java:expansion-service:processResources NO-SOURCE
> Task :sdks:java:core:createCheckerFrameworkManifest UP-TO-DATE
> Task :model:fn-execution:processResources UP-TO-DATE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:createCheckerFrameworkManifest UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:windmill:createCheckerFrameworkManifest UP-TO-DATE
> Task :sdks:java:io:kafka:createCheckerFrameworkManifest UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:processResources NO-SOURCE
> Task :sdks:java:io:kinesis:createCheckerFrameworkManifest UP-TO-DATE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :sdks:java:io:kafka:processResources NO-SOURCE
> Task :sdks:java:io:synthetic:createCheckerFrameworkManifest UP-TO-DATE
> Task :sdks:java:testing:load-tests:createCheckerFrameworkManifest UP-TO-DATE
> Task :sdks:java:testing:test-utils:createCheckerFrameworkManifest UP-TO-DATE
> Task :sdks:java:io:kinesis:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :sdks:java:io:synthetic:processResources NO-SOURCE
> Task :sdks:java:testing:load-tests:processResources NO-SOURCE
> Task :sdks:java:testing:test-utils:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :sdks:java:core:generateGrammarSource UP-TO-DATE
> Task :sdks:java:core:processResources UP-TO-DATE
> Task :model:pipeline:extractIncludeProto UP-TO-DATE
> Task :model:pipeline:extractProto UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:windmill:extractIncludeProto UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:windmill:extractProto UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:windmill:generateProto UP-TO-DATE
> Task :model:pipeline:generateProto UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:windmill:compileJava UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:windmill:processResources UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:windmill:classes UP-TO-DATE
> Task :model:pipeline:compileJava UP-TO-DATE
> Task :model:pipeline:processResources UP-TO-DATE
> Task :model:pipeline:classes UP-TO-DATE
> Task :model:pipeline:jar UP-TO-DATE
> Task :model:pipeline:shadowJar UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:windmill:shadowJar UP-TO-DATE
> Task :model:job-management:extractIncludeProto UP-TO-DATE
> Task :model:fn-execution:extractIncludeProto UP-TO-DATE
> Task :model:fn-execution:generateProto UP-TO-DATE
> Task :model:job-management:generateProto UP-TO-DATE
> Task :model:job-management:compileJava UP-TO-DATE
> Task :model:job-management:classes UP-TO-DATE
> Task :model:fn-execution:compileJava UP-TO-DATE
> Task :model:fn-execution:classes UP-TO-DATE
> Task :model:job-management:shadowJar UP-TO-DATE
> Task :model:fn-execution:shadowJar UP-TO-DATE
> Task :sdks:java:core:compileJava UP-TO-DATE
> Task :sdks:java:core:classes UP-TO-DATE
> Task :sdks:java:core:shadowJar UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:compileJava UP-TO-DATE
> Task :sdks:java:extensions:protobuf:extractIncludeProto UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :sdks:java:extensions:protobuf:compileJava UP-TO-DATE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:shadowJar UP-TO-DATE
> Task :sdks:java:extensions:protobuf:jar UP-TO-DATE
> Task :sdks:java:fn-execution:compileJava UP-TO-DATE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :sdks:java:fn-execution:jar UP-TO-DATE
> Task :sdks:java:io:synthetic:compileJava UP-TO-DATE
> Task :sdks:java:io:synthetic:classes UP-TO-DATE
> Task :sdks:java:io:synthetic:jar UP-TO-DATE
> Task :sdks:java:testing:test-utils:compileJava UP-TO-DATE
> Task :sdks:java:testing:test-utils:classes UP-TO-DATE
> Task :runners:core-construction-java:compileJava UP-TO-DATE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :sdks:java:testing:test-utils:jar UP-TO-DATE
> Task :runners:core-construction-java:jar UP-TO-DATE
> Task :sdks:java:io:kinesis:compileJava UP-TO-DATE
> Task :sdks:java:io:kinesis:classes UP-TO-DATE
> Task :runners:core-java:compileJava UP-TO-DATE
> Task :runners:core-java:classes UP-TO-DATE
> Task :runners:core-java:jar UP-TO-DATE
> Task :sdks:java:io:kinesis:jar UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar UP-TO-DATE
> Task :sdks:java:harness:compileJava UP-TO-DATE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar UP-TO-DATE
> Task :sdks:java:harness:shadowJar UP-TO-DATE
> Task :runners:java-fn-execution:compileJava UP-TO-DATE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar UP-TO-DATE
> Task :sdks:java:expansion-service:compileJava UP-TO-DATE
> Task :sdks:java:expansion-service:classes UP-TO-DATE
> Task :sdks:java:expansion-service:jar UP-TO-DATE
> Task :sdks:java:io:kafka:compileJava UP-TO-DATE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:io:kafka:jar UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:compileJava UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:jar UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:compileJava UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:classes UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:jar UP-TO-DATE
> Task :sdks:java:testing:load-tests:compileJava UP-TO-DATE
> Task :sdks:java:testing:load-tests:classes UP-TO-DATE
> Task :sdks:java:testing:load-tests:jar UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:classes UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar UP-TO-DATE

> Task :sdks:java:testing:load-tests:run
Feb 05, 2021 12:39:19 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Feb 05, 2021 12:39:20 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 187 files. Enable logging at DEBUG level to see which files will be staged.
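The two INFO messages above show DataflowRunner filling in defaults: stagingLocation falls back to gcpTempLocation, and filesToStage is inferred from the classpath. For reference, a minimal sketch of setting these options explicitly through the standard DataflowPipelineOptions accessors follows; the project ID and bucket paths are placeholders, not values used by this test.

    // Sketch only: explicitly configuring the locations that the log above shows
    // being defaulted. Project ID and bucket paths are placeholders.
    import org.apache.beam.runners.dataflow.DataflowRunner;
    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class ExplicitStagingOptions {
      public static void main(String[] args) {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineOptions.class);
        options.setRunner(DataflowRunner.class);
        options.setProject("my-gcp-project");                  // placeholder project
        options.setRegion("us-central1");
        options.setGcpTempLocation("gs://my-bucket/temp");     // placeholder bucket
        options.setStagingLocation("gs://my-bucket/staging");  // avoids the gcpTempLocation fallback
        Pipeline pipeline = Pipeline.create(options);
        // ... attach transforms here before running ...
        pipeline.run();
      }
    }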
Feb 05, 2021 12:39:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Feb 05, 2021 12:39:29 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 188 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Feb 05, 2021 12:39:30 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.29.0-SNAPSHOT-A6m4bxgyKMMyKxqw0gkVdW8WGw4nJLEEWxooQpL0-zk.jar
Feb 05, 2021 12:39:34 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 188 files cached, 0 files newly uploaded in 4 seconds
Feb 05, 2021 12:39:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Feb 05, 2021 12:39:34 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6b4283c4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@d0865a3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@636bbbbb, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7eae3764, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@10dc7d6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4f668f29, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@716e431d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e744f43, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@11a8042c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6a4ccef7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@69391e08, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@35eb4a3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@64b3b1ce, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6884f0d9, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@49ec6a9f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@26b95b0b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5f7da3d3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@103082dd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3a22bad6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@56afdf9a]
Feb 05, 2021 12:39:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Feb 05, 2021 12:39:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics as step s3
Feb 05, 2021 12:39:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Total bytes monitor as step s4
Feb 05, 2021 12:39:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s5
Feb 05, 2021 12:39:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Group by key (0) as step s6
Feb 05, 2021 12:39:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate (0) as step s7
Feb 05, 2021 12:39:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics (0) as step s8
Feb 05, 2021 12:39:34 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
Feb 05, 2021 12:39:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <82670 bytes, hash 83650ea3d7d719d321837a180dd720bfbaf618b0b2c8d6325107365e2a99b61d> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-g2UOo9fXGdMhg3oYDdcgv7r2GLCyyNYyUQc2XiqZth0.pb
Feb 05, 2021 12:39:35 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.29.0-SNAPSHOT
Feb 05, 2021 12:39:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-05_04_39_35-16784125508557749774?project=apache-beam-testing
Feb 05, 2021 12:39:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-02-05_04_39_35-16784125508557749774
Feb 05, 2021 12:39:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-02-05_04_39_35-16784125508557749774
Feb 05, 2021 12:39:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-02-05T12:39:38.862Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java0dataflow0streaming0gbk02-jenkins-020512392-jrgr. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Feb 05, 2021 12:39:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T12:39:42.931Z: Worker configuration: n1-standard-4 in us-central1-f.
Feb 05, 2021 12:39:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T12:39:43.487Z: Expanding CoGroupByKey operations into optimizable parts.
Feb 05, 2021 12:39:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T12:39:43.695Z: Expanding SplittableProcessKeyed operations into optimizable parts.
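The WARNING above is emitted because the generated workflow name does not satisfy the Cloud Label restrictions, so Dataflow substitutes a modified name for monitoring labels. A minimal sketch of choosing a label-safe job name up front via PipelineOptions follows; the name shown is a placeholder, not the one this test uses.

    // Sketch only: a job name that already satisfies the Cloud Label restrictions
    // (lowercase letters, digits, hyphens), so no modified name is needed for labels.
    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class LabelSafeJobName {
      public static void main(String[] args) {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineOptions.class);
        options.setJobName("load-tests-java-dataflow-streaming-gbk-2"); // placeholder name
      }
    }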
Feb 05, 2021 12:39:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T12:39:43.729Z: Expanding GroupByKey operations into streaming Read/Write steps
Feb 05, 2021 12:39:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T12:39:43.802Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Feb 05, 2021 12:39:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T12:39:43.908Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Feb 05, 2021 12:39:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T12:39:43.941Z: Fusing consumer Read input/StripIds into Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds
Feb 05, 2021 12:39:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T12:39:43.975Z: Fusing consumer Collect start time metrics into Read input/StripIds
Feb 05, 2021 12:39:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T12:39:44.023Z: Fusing consumer Total bytes monitor into Collect start time metrics
Feb 05, 2021 12:39:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T12:39:44.056Z: Fusing consumer Window.Into()/Window.Assign into Total bytes monitor
Feb 05, 2021 12:39:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T12:39:44.093Z: Fusing consumer Group by key (0)/WriteStream into Window.Into()/Window.Assign
Feb 05, 2021 12:39:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T12:39:44.128Z: Fusing consumer Group by key (0)/MergeBuckets into Group by key (0)/ReadStream
Feb 05, 2021 12:39:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T12:39:44.156Z: Fusing consumer Ungroup and reiterate (0) into Group by key (0)/MergeBuckets
Feb 05, 2021 12:39:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T12:39:44.215Z: Fusing consumer Collect end time metrics (0) into Ungroup and reiterate (0)
Feb 05, 2021 12:39:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T12:39:45.398Z: Executing operation Group by key (0)/ReadStream+Group by key (0)/MergeBuckets+Ungroup and reiterate (0)+Collect end time metrics (0)
Feb 05, 2021 12:39:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T12:39:45.437Z: Executing operation Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds+Read input/StripIds+Collect start time metrics+Total bytes monitor+Window.Into()/Window.Assign+Group by key (0)/WriteStream
Feb 05, 2021 12:39:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T12:39:45.469Z: Starting 5 workers in us-central1-f...
Feb 05, 2021 12:40:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T12:40:15.449Z: Autoscaling: Raised the number of workers to 3 so that the pipeline can catch up with its backlog and keep up with its input rate.
Feb 05, 2021 12:40:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T12:40:15.563Z: Resized worker pool to 3, though goal was 5. This could be a quota issue.
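The resize message above indicates the requested pool of 5 workers was only partially satisfied at first, possibly due to quota. For reference, a minimal sketch of how the worker count and machine type can be pinned through DataflowPipelineOptions follows; the values echo the log but are illustrative only, and the quota caveat still applies.

    // Sketch only: pinning worker count and machine type; values mirror the log
    // (5 workers, n1-standard-4) but are not the harness configuration itself.
    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class WorkerPoolSettings {
      public static void main(String[] args) {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineOptions.class);
        options.setNumWorkers(5);                       // initial workers requested
        options.setMaxNumWorkers(5);                    // autoscaling ceiling
        options.setWorkerMachineType("n1-standard-4");  // machine type reported above
      }
    }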
Feb 05, 2021 12:40:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T12:40:15.664Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Feb 05, 2021 12:40:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T12:40:26.134Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Feb 05, 2021 12:40:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T12:40:48.059Z: Workers have started successfully.
Feb 05, 2021 12:40:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T12:40:48.088Z: Workers have started successfully.
Feb 05, 2021 4:00:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T16:00:34.284Z: Cancel request is committed for workflow job: 2021-02-05_04_39_35-16784125508557749774.
Feb 05, 2021 4:00:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T16:00:34.308Z: Finished operation Group by key (0)/ReadStream+Group by key (0)/MergeBuckets+Ungroup and reiterate (0)+Collect end time metrics (0)
Feb 05, 2021 4:00:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T16:00:34.309Z: Finished operation Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds+Read input/StripIds+Collect start time metrics+Total bytes monitor+Window.Into()/Window.Assign+Group by key (0)/WriteStream
Feb 05, 2021 4:00:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T16:00:34.497Z: Cleaning up.
Feb 05, 2021 4:00:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T16:00:34.841Z: Stopping worker pool...
Feb 05, 2021 4:01:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T16:01:30.015Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Feb 05, 2021 4:01:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-02-05T16:01:30.072Z: Worker pool stopped.
Feb 05, 2021 4:01:37 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-02-05_04_39_35-16784125508557749774 finished with status CANCELLED.
Load test results for test (ID): c1058f5d-8588-4243-a0ec-76c2b92a00b2 and timestamp: 2021-02-05T12:39:22.066000000Z:

                    Metric:      Value:
       dataflow_runtime_sec   11878.593
  dataflow_total_bytes_count  1.999998E9
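The job above finished with status CANCELLED, and the harness (see the exception below, raised from JobFailure.handleFailure) treats any non-successful terminal state as a failure. A minimal sketch of checking a terminal state with the public PipelineResult API follows; it is not a reproduction of the harness's own logic.

    // Sketch only: reacting to a terminal job state via the public PipelineResult
    // API; this is not the implementation of JobFailure.handleFailure.
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.PipelineResult;

    public class TerminalStateCheck {
      static void runAndCheck(Pipeline pipeline) {
        PipelineResult result = pipeline.run();
        PipelineResult.State state = result.waitUntilFinish(); // blocks until terminal
        if (state != PipelineResult.State.DONE) {
          // Surface CANCELLED (or FAILED) as an error, similar in spirit to the
          // RuntimeException shown in the log below.
          throw new RuntimeException("Invalid job state: " + state);
        }
      }
    }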
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
    at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
    at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:137)
    at org.apache.beam.sdk.loadtests.GroupByKeyLoadTest.run(GroupByKeyLoadTest.java:57)
    at org.apache.beam.sdk.loadtests.GroupByKeyLoadTest.main(GroupByKeyLoadTest.java:131)

> Task :sdks:java:testing:load-tests:run FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 22m 53s
90 actionable tasks: 1 executed, 89 up-to-date

Publishing build scan...
https://gradle.com/s/ryatp6gysb64m

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
