See <https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Streaming_Java11/80/display/redirect?page=changes>
Changes:

[noreply] [BEAM-12769] Adds support for expanding a Java cross-language transform


------------------------------------------
[...truncated 46.96 KB...]
+ tee /output/licenses/list.csv
google.golang.org/grpc,https://github.com/grpc/grpc-go/blob/master/LICENSE,Apache-2.0
google.golang.org/genproto/googleapis/rpc/status,https://github.com/googleapis/go-genproto/blob/master/LICENSE,Apache-2.0
golang.org/x/text,https://go.googlesource.com/text/+/refs/heads/master/LICENSE,BSD-3-Clause
github.com/apache/beam/sdks/go/pkg/beam,https://github.com/apache/beam/blob/master/sdks/go/README.md,Apache-2.0
golang.org/x/net,https://go.googlesource.com/net/+/refs/heads/master/LICENSE,BSD-3-Clause
github.com/golang/protobuf,https://github.com/golang/protobuf/blob/master/LICENSE,BSD-3-Clause
google.golang.org/protobuf,https://go.googlesource.com/protobuf/+/refs/heads/master/LICENSE,BSD-3-Clause
golang.org/x/sys,https://go.googlesource.com/sys/+/refs/heads/master/LICENSE,BSD-3-Clause
github.com/apache/beam/sdks/java/container,https://github.com/apache/beam/blob/master/LICENSE,Apache-2.0
+ chmod -R a+w /output/licenses

> Task :release:go-licenses:java:createLicenses
> Task :sdks:java:container:java11:copyGolangLicenses
> Task :sdks:java:container:java11:dockerPrepare
> Task :sdks:java:container:java11:docker
> Task :runners:google-cloud-dataflow-java:buildAndPushDockerContainer
WARNING: `gcloud docker` will not be supported for Docker client versions above 18.03.

As an alternative, use `gcloud auth configure-docker` to configure `docker` to use
`gcloud` as a credential helper, then use `docker` as you would for non-GCR registries,
e.g. `docker pull gcr.io/project-id/my-image`.

Add `--verbosity=error` to silence this warning: `gcloud docker --verbosity=error -- pull gcr.io/project-id/my-image`.
See: https://cloud.google.com/container-registry/docs/support/deprecation-notices#gcloud-docker

The push refers to repository [us.gcr.io/apache-beam-testing/java-postcommit-it/java]
b20f463f9ace: Preparing
2d6e773a2559: Preparing
1bb092d84987: Preparing
67a0736b45eb: Preparing
622bbde18fa3: Preparing
d82e52c2c51a: Preparing
c49a815bf0c9: Preparing
0a3d06b8585e: Preparing
7436ee851b24: Preparing
6fc005b9b4e7: Preparing
c6397b8ce55b: Preparing
2f13a1debaac: Preparing
3891808a925b: Preparing
d402f4f1b906: Preparing
00ef5416d927: Preparing
8555e663f65b: Preparing
0a3d06b8585e: Waiting
c49a815bf0c9: Waiting
d00da3cd7763: Preparing
4e61e63529c2: Preparing
799760671c38: Preparing
d82e52c2c51a: Waiting
3891808a925b: Waiting
c6397b8ce55b: Waiting
6fc005b9b4e7: Waiting
d402f4f1b906: Waiting
2f13a1debaac: Waiting
4e61e63529c2: Waiting
799760671c38: Waiting
d00da3cd7763: Waiting
8555e663f65b: Waiting
622bbde18fa3: Pushed
2d6e773a2559: Pushed
1bb092d84987: Pushed
b20f463f9ace: Pushed
67a0736b45eb: Pushed
7436ee851b24: Pushed
d82e52c2c51a: Pushed
0a3d06b8585e: Pushed
3891808a925b: Layer already exists
d402f4f1b906: Layer already exists
00ef5416d927: Layer already exists
8555e663f65b: Layer already exists
d00da3cd7763: Layer already exists
4e61e63529c2: Layer already exists
799760671c38: Layer already exists
c49a815bf0c9: Pushed
2f13a1debaac: Pushed
c6397b8ce55b: Pushed
6fc005b9b4e7: Pushed
20210905133327: digest: sha256:0af8f0051781499546a9c060eefd0edba2cfa3d05219a15dc34c5964e2e2b98b size: 4311

> Task :sdks:java:testing:load-tests:run
Sep 05, 2021 1:36:06 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Sep 05, 2021 1:36:06 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 193 files. Enable logging at DEBUG level to see which files will be staged.
Sep 05, 2021 1:36:07 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: ParDo(TimeMonitor)
Sep 05, 2021 1:36:07 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Sep 05, 2021 1:36:10 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Sep 05, 2021 1:36:10 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <108583 bytes, hash f67f48fdbc0cb1c29ff59e0ed3d2e233158942126494d6ec930c4098722cafb9> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-9n9I_bwMscKf9Z4O09LiMxWJQhJklNbskwxAmHIsr7k.pb
Sep 05, 2021 1:36:11 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 193 files from PipelineOptions.filesToStage to staging location to prepare for execution.
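Note on the Pipeline.validate WARNING above ("The following transforms do not have stable unique names: ParDo(TimeMonitor)"): it appears when the same default-named ParDo is applied more than once, so the Dataflow translator renames the second application (see "ParDo(TimeMonitor)2" added as step s15 below). The following is a minimal, hypothetical sketch of the usual fix, not the load test's actual code; the TimeMonitor DoFn here is a stand-in. Passing an explicit, distinct name to each apply() keeps the names stable.

import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.transforms.ParDo;
import org.apache.beam.sdk.values.PCollection;

public class StableNamesSketch {
  // Stand-in for the load test's TimeMonitor DoFn; it just passes elements through.
  static class TimeMonitor extends DoFn<String, String> {
    @ProcessElement
    public void processElement(ProcessContext c) {
      c.output(c.element());
    }
  }

  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

    PCollection<String> input = p.apply("Create input", Create.of("a", "b", "c"));

    // Explicit names keep the two applications of the same DoFn unique and stable,
    // so Pipeline.validate() no longer warns and Dataflow job updates can match steps.
    input
        .apply("Record start time", ParDo.of(new TimeMonitor()))
        .apply("Record end time", ParDo.of(new TimeMonitor()));

    p.run().waitUntilFinish();
  }
}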
Sep 05, 2021 1:36:12 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 193 files cached, 0 files newly uploaded in 0 seconds
Sep 05, 2021 1:36:12 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Sep 05, 2021 1:36:12 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@23f3da8b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5634d0f4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@252a8aae, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3d4e405e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@54e2fe, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@70972170, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@119aa36, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4e1a46fb, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@69fe0ed4, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@20ab3e3a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6caf7803, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@709ed6f3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@698fee9a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@102c577f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7d44a19, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1fb2d5e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1716e8c5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6573d2f7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4052c8c2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@181b8c4b]
Sep 05, 2021 1:36:12 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Sep 05, 2021 1:36:12 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(TimeMonitor) as step s3
Sep 05, 2021 1:36:12 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(ByteMonitor) as step s4
Sep 05, 2021 1:36:12 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 0 as step s5
Sep 05, 2021 1:36:12 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 1 as step s6
Sep 05, 2021 1:36:12 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 2 as step s7
Sep 05, 2021 1:36:12 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 3 as step s8
Sep 05, 2021 1:36:12 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 4 as step s9
Sep 05, 2021 1:36:12 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 5 as step s10
Sep 05, 2021 1:36:12 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 6 as step s11
Sep 05, 2021 1:36:12 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 7 as step s12
Sep 05, 2021 1:36:12 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 8 as step s13
Sep 05, 2021 1:36:12 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 9 as step s14
Sep 05, 2021 1:36:12 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(TimeMonitor)2 as step s15
Sep 05, 2021 1:36:12 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
Sep 05, 2021 1:36:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-05_06_36_12-5122144847036532988?project=apache-beam-testing
Sep 05, 2021 1:36:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-09-05_06_36_12-5122144847036532988
Sep 05, 2021 1:36:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel
> --region=us-central1 2021-09-05_06_36_12-5122144847036532988
Sep 05, 2021 1:36:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-05T13:36:19.861Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0pardo01-jenkins-09-iyw5. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Sep 05, 2021 1:36:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-05T13:36:24.819Z: Worker configuration: e2-standard-2 in us-central1-a.
Sep 05, 2021 1:36:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-05T13:36:25.600Z: Expanding SplittableParDo operations into optimizable parts.
Sep 05, 2021 1:36:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-05T13:36:25.626Z: Expanding CollectionToSingleton operations into optimizable parts.
Sep 05, 2021 1:36:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-05T13:36:25.702Z: Expanding CoGroupByKey operations into optimizable parts.
Sep 05, 2021 1:36:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-05T13:36:25.745Z: Expanding SplittableProcessKeyed operations into optimizable parts.
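Note on the translator output above: steps s1-s15 trace the shape of this ParDo load test pipeline, a monitored synthetic read followed by ten chained counting ParDos ("Step: 0" through "Step: 9") and a final time monitor. Below is a minimal, hypothetical sketch of the central ParDo chain only, assuming stand-in DoFns and a bounded Create in place of the actual SyntheticUnboundedSource and load-test classes, to make the step names in the log concrete.

import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.coders.ByteArrayCoder;
import org.apache.beam.sdk.coders.KvCoder;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.transforms.ParDo;
import org.apache.beam.sdk.values.KV;
import org.apache.beam.sdk.values.PCollection;

public class ParDoChainSketch {
  // Stand-in for the load test's CounterOperation DoFn: a pass-through where a
  // per-element counter metric would normally be incremented.
  static class CounterOperation extends DoFn<KV<byte[], byte[]>, KV<byte[], byte[]>> {
    @ProcessElement
    public void processElement(ProcessContext c) {
      c.output(c.element());
    }
  }

  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

    // A single synthetic-style KV record stands in for the unbounded synthetic source.
    PCollection<KV<byte[], byte[]>> records =
        p.apply("Read input",
            Create.of(KV.of(new byte[10], new byte[90]))
                .withCoder(KvCoder.of(ByteArrayCoder.of(), ByteArrayCoder.of())));

    // Chain of ten ParDos, mirroring steps s5-s14 ("Step: 0" ... "Step: 9") in the log.
    for (int i = 0; i < 10; i++) {
      records = records.apply("Step: " + i, ParDo.of(new CounterOperation()));
    }

    p.run().waitUntilFinish();
  }
}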
Sep 05, 2021 1:36:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-05T13:36:25.771Z: Expanding GroupByKey operations into streaming Read/Write steps
Sep 05, 2021 1:36:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-05T13:36:25.802Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Sep 05, 2021 1:36:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-05T13:36:25.868Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Sep 05, 2021 1:36:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-05T13:36:25.899Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Sep 05, 2021 1:36:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-05T13:36:25.924Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Sep 05, 2021 1:36:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-05T13:36:25.957Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Sep 05, 2021 1:36:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-05T13:36:25.980Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Sep 05, 2021 1:36:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-05T13:36:26.008Z: Fusing consumer ParDo(TimeMonitor)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Sep 05, 2021 1:36:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-05T13:36:26.043Z: Fusing consumer ParDo(ByteMonitor)/ParMultiDo(ByteMonitor) into ParDo(TimeMonitor)/ParMultiDo(TimeMonitor)
Sep 05, 2021 1:36:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-05T13:36:26.067Z: Fusing consumer Step: 0/ParMultiDo(CounterOperation) into ParDo(ByteMonitor)/ParMultiDo(ByteMonitor)
Sep 05, 2021 1:36:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-05T13:36:26.089Z: Fusing consumer Step: 1/ParMultiDo(CounterOperation) into Step: 0/ParMultiDo(CounterOperation)
Sep 05, 2021 1:36:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-05T13:36:26.120Z: Fusing consumer Step: 2/ParMultiDo(CounterOperation) into Step: 1/ParMultiDo(CounterOperation)
Sep 05, 2021 1:36:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-05T13:36:26.153Z: Fusing consumer Step: 3/ParMultiDo(CounterOperation) into Step: 2/ParMultiDo(CounterOperation)
Sep 05, 2021 1:36:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-05T13:36:26.185Z: Fusing consumer Step: 4/ParMultiDo(CounterOperation) into Step: 3/ParMultiDo(CounterOperation)
Sep 05, 2021 1:36:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-05T13:36:26.218Z: Fusing consumer Step: 5/ParMultiDo(CounterOperation) into Step: 4/ParMultiDo(CounterOperation)
Sep 05, 2021 1:36:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-05T13:36:26.251Z: Fusing consumer Step: 6/ParMultiDo(CounterOperation) into Step: 5/ParMultiDo(CounterOperation)
Sep 05, 2021 1:36:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-05T13:36:26.290Z: Fusing consumer Step: 7/ParMultiDo(CounterOperation) into Step: 6/ParMultiDo(CounterOperation)
Sep 05, 2021 1:36:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-05T13:36:26.317Z: Fusing consumer Step: 8/ParMultiDo(CounterOperation) into Step: 7/ParMultiDo(CounterOperation)
Sep 05, 2021 1:36:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-05T13:36:26.349Z: Fusing consumer Step: 9/ParMultiDo(CounterOperation) into Step: 8/ParMultiDo(CounterOperation)
Sep 05, 2021 1:36:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-05T13:36:26.389Z: Fusing consumer ParDo(TimeMonitor)2/ParMultiDo(TimeMonitor) into Step: 9/ParMultiDo(CounterOperation)
Sep 05, 2021 1:36:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-05T13:36:26.665Z: Starting 5 ****s in us-central1-a...
Sep 05, 2021 1:36:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-05T13:36:31.106Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Sep 05, 2021 1:37:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-05T13:37:25.549Z: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Sep 05, 2021 1:37:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-05T13:37:53.872Z: Workers have started successfully.
Sep 05, 2021 1:37:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-05T13:37:53.899Z: Workers have started successfully.
Sep 05, 2021 1:42:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-05T13:42:31.392Z: Cleaning up.
Sep 05, 2021 1:42:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-05T13:42:31.496Z: Stopping **** pool...
Sep 05, 2021 1:42:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-05T13:42:31.570Z: Stopping **** pool...
Sep 05, 2021 1:44:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-05T13:44:54.488Z: Autoscaling: Reduced the number of ****s to 0 based on low average **** CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Sep 05, 2021 1:44:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-05T13:44:54.548Z: Worker pool stopped.
Sep 05, 2021 1:45:00 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-09-05_06_36_12-5122144847036532988 finished with status DONE.

Load test results for test (ID): 38df7a45-fa07-45f5-875c-6d250fa0980f and timestamp: 2021-09-05T13:36:07.112000000Z:
                 Metric:                                     Value:
dataflow_v2_java11_runtime_sec                              153.448
dataflow_v2_java11_total_bytes_count                    9.7960172E9

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages FAILED
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210905133327
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0af8f0051781499546a9c060eefd0edba2cfa3d05219a15dc34c5964e2e2b98b
ERROR: (gcloud.container.images.untag) Image could not be found: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210905133327]

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Streaming_Java11/ws/src/runners/google-cloud-dataflow-java/build.gradle'> line: 279

* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:cleanUpDockerImages'.
> Process 'command 'gcloud'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 12m 1s
101 actionable tasks: 71 executed, 28 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/h6fv657cv4y4a

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
