See
<https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_Streaming/736/display/redirect?page=changes>
Changes:
[Ismaël Mejía] [BEAM-12342] Upgrade Spark 2 to version 2.4.8
[Udi Meiri] [BEAM-12352] Skip GcsIOIntegrationTest.test_copy{,_batch}_rewrite_token
[noreply] [BEAM-3713] Move validatesRunnerBatchTests and validatesRunnerStreaming
------------------------------------------
[...truncated 19.77 KB...]
> Task :sdks:java:core:processResources
> Task :runners:google-cloud-dataflow-java:worker:windmill:extractIncludeProto
> Task :model:pipeline:extractIncludeProto
> Task :model:pipeline:extractProto
> Task :runners:google-cloud-dataflow-java:worker:windmill:extractProto
> Task :runners:google-cloud-dataflow-java:worker:windmill:generateProto FROM-CACHE
> Task :model:pipeline:generateProto FROM-CACHE
> Task :runners:google-cloud-dataflow-java:worker:windmill:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-java:worker:windmill:processResources
> Task :runners:google-cloud-dataflow-java:worker:windmill:classes
> Task :model:pipeline:compileJava FROM-CACHE
> Task :model:pipeline:processResources
> Task :model:pipeline:classes
> Task :runners:google-cloud-dataflow-java:worker:windmill:shadowJar FROM-CACHE
> Task :model:pipeline:jar
> Task :model:pipeline:shadowJar FROM-CACHE
> Task :model:fn-execution:extractIncludeProto
> Task :model:job-management:extractIncludeProto
> Task :model:fn-execution:generateProto FROM-CACHE
> Task :model:job-management:generateProto FROM-CACHE
> Task :model:job-management:compileJava FROM-CACHE
> Task :model:job-management:classes
> Task :model:fn-execution:compileJava FROM-CACHE
> Task :model:fn-execution:classes
> Task :model:job-management:shadowJar FROM-CACHE
> Task :model:fn-execution:shadowJar FROM-CACHE
> Task :sdks:java:core:compileJava FROM-CACHE
> Task :sdks:java:core:classes
> Task :sdks:java:core:shadowJar FROM-CACHE
> Task :sdks:java:io:synthetic:compileJava FROM-CACHE
> Task :sdks:java:io:synthetic:classes UP-TO-DATE
> Task :sdks:java:fn-execution:compileJava FROM-CACHE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :sdks:java:io:synthetic:jar
> Task :sdks:java:fn-execution:jar
> Task :sdks:java:testing:test-utils:compileJava FROM-CACHE
> Task :sdks:java:testing:test-utils:classes UP-TO-DATE
> Task :sdks:java:testing:test-utils:jar
> Task :sdks:java:extensions:protobuf:extractIncludeProto
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :sdks:java:extensions:protobuf:compileJava FROM-CACHE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :runners:core-construction-java:compileJava FROM-CACHE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:jar
> Task :runners:core-construction-java:jar
> Task :runners:core-java:compileJava FROM-CACHE
> Task :runners:core-java:classes UP-TO-DATE
> Task :runners:core-java:jar
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava FROM-CACHE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar
> Task :sdks:java:harness:compileJava FROM-CACHE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar
> Task :sdks:java:harness:shadowJar FROM-CACHE
> Task :runners:java-fn-execution:compileJava FROM-CACHE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar
> Task :sdks:java:expansion-service:compileJava FROM-CACHE
> Task :sdks:java:expansion-service:classes UP-TO-DATE
> Task :sdks:java:expansion-service:jar
> Task :sdks:java:io:kafka:compileJava FROM-CACHE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:io:kafka:jar
> Task :sdks:java:io:google-cloud-platform:compileJava FROM-CACHE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:jar
> Task :runners:google-cloud-dataflow-java:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-java:classes
> Task :runners:google-cloud-dataflow-java:jar
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:classes UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar FROM-CACHE
> Task :sdks:java:io:kinesis:compileJava
> Task :sdks:java:io:kinesis:classes
> Task :sdks:java:io:kinesis:jar
> Task :sdks:java:testing:load-tests:compileJava
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
> Task :sdks:java:testing:load-tests:classes
> Task :sdks:java:testing:load-tests:jar
> Task :sdks:java:testing:load-tests:run
May 21, 2021 12:09:53 PM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
May 21, 2021 12:09:53 PM
org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory
create
INFO: No stagingLocation provided, falling back to gcpTempLocation
May 21, 2021 12:09:54 PM org.apache.beam.runners.dataflow.DataflowRunner
fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from
the classpath: will stage 188 files. Enable logging at DEBUG level to see which
files will be staged.
May 21, 2021 12:09:54 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names:
ParDo(TimeMonitor)
May 21, 2021 12:09:54 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing
implications related to Google Compute Engine usage and other Google Cloud
Services.
May 21, 2021 12:09:58 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to
gs://temp-storage-for-perf-tests/loadtests/staging/
May 21, 2021 12:09:59 PM org.apache.beam.runners.dataflow.util.PackageUtil
stageClasspathElements
INFO: Uploading 189 files from PipelineOptions.filesToStage to staging location
to prepare for execution.
May 21, 2021 12:09:59 PM
org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes
forFileToStage
INFO: Staging custom dataflow-worker.jar as
beam-runners-google-cloud-dataflow-java-legacy-worker-2.31.0-SNAPSHOT-p8DZqfSGxUnLeoUu48qOOTbZOXglhdbI2awWBgSdBQs.jar
May 21, 2021 12:09:59 PM org.apache.beam.runners.dataflow.util.PackageUtil
stageClasspathElements
INFO: Staging files complete: 189 files cached, 0 files newly uploaded in 0
seconds
May 21, 2021 12:09:59 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as
step s1
May 21, 2021 12:10:00 PM
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes:
[org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4faa298,
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1cd3b138,
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@151bf776,
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5a6d30e2,
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@b52b755,
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@a098d76,
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@40e37b06,
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@733aa9d8,
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6dcc40f5,
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2b680207,
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@70887727,
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@56da7487,
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@599e4d41,
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@328d044f,
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@10f7c76,
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4745e9c,
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2f2bff16,
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@75de29c0,
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fc807c1,
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@296e281a]
May 21, 2021 12:10:00 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
May 21, 2021 12:10:00 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(TimeMonitor) as step s3
May 21, 2021 12:10:00 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(ByteMonitor) as step s4
May 21, 2021 12:10:00 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 0 as step s5
May 21, 2021 12:10:00 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 1 as step s6
May 21, 2021 12:10:00 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 2 as step s7
May 21, 2021 12:10:00 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 3 as step s8
May 21, 2021 12:10:00 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 4 as step s9
May 21, 2021 12:10:00 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 5 as step s10
May 21, 2021 12:10:00 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 6 as step s11
May 21, 2021 12:10:00 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 7 as step s12
May 21, 2021 12:10:00 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 8 as step s13
May 21, 2021 12:10:00 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 9 as step s14
May 21, 2021 12:10:00 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(TimeMonitor)2 as step s15
May 21, 2021 12:10:00 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.31.0-SNAPSHOT
May 21, 2021 12:10:01 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-21_05_10_00-13007117209361637681?project=apache-beam-testing
May 21, 2021 12:10:01 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-05-21_05_10_00-13007117209361637681
May 21, 2021 12:10:01 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-05-21_05_10_00-13007117209361637681
May 21, 2021 12:10:06 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-05-21T12:10:05.940Z: The workflow name is not a valid Cloud
Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring
will be labeled with this modified job name:
load0tests0java0dataflow0streaming0pardo01-jenkins-0521120-uzih. For the best
monitoring experience, please name your job with a valid Cloud Label. For
details, see:
https://cloud.google.com/compute/docs/labeling-resources#restrictions
May 21, 2021 12:10:09 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-05-21T12:10:09.618Z: Worker configuration: n1-standard-4 in
us-central1-f.
May 21, 2021 12:10:11 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-05-21T12:10:10.346Z: Expanding CoGroupByKey operations into
optimizable parts.
May 21, 2021 12:10:11 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-05-21T12:10:10.530Z: Expanding SplittableProcessKeyed operations
into optimizable parts.
May 21, 2021 12:10:11 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-05-21T12:10:10.578Z: Expanding GroupByKey operations into streaming
Read/Write steps
May 21, 2021 12:10:11 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-05-21T12:10:10.629Z: Lifting ValueCombiningMappingFns into
MergeBucketsMappingFns
May 21, 2021 12:10:11 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-05-21T12:10:10.725Z: Fusing adjacent ParDo, Read, Write, and Flatten
operations
May 21, 2021 12:10:11 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-05-21T12:10:10.762Z: Fusing consumer Read input/StripIds into Read
input/DataflowRunner.StreamingUnboundedRead.ReadWithIds
May 21, 2021 12:10:11 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-05-21T12:10:10.806Z: Fusing consumer ParDo(TimeMonitor) into Read
input/StripIds
May 21, 2021 12:10:11 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-05-21T12:10:10.842Z: Fusing consumer ParDo(ByteMonitor) into
ParDo(TimeMonitor)
May 21, 2021 12:10:11 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-05-21T12:10:10.864Z: Fusing consumer Step: 0 into ParDo(ByteMonitor)
May 21, 2021 12:10:11 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-05-21T12:10:10.893Z: Fusing consumer Step: 1 into Step: 0
May 21, 2021 12:10:11 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-05-21T12:10:10.928Z: Fusing consumer Step: 2 into Step: 1
May 21, 2021 12:10:11 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-05-21T12:10:10.970Z: Fusing consumer Step: 3 into Step: 2
May 21, 2021 12:10:11 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-05-21T12:10:11.005Z: Fusing consumer Step: 4 into Step: 3
May 21, 2021 12:10:11 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-05-21T12:10:11.040Z: Fusing consumer Step: 5 into Step: 4
May 21, 2021 12:10:11 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-05-21T12:10:11.067Z: Fusing consumer Step: 6 into Step: 5
May 21, 2021 12:10:11 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-05-21T12:10:11.105Z: Fusing consumer Step: 7 into Step: 6
May 21, 2021 12:10:11 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-05-21T12:10:11.136Z: Fusing consumer Step: 8 into Step: 7
May 21, 2021 12:10:11 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-05-21T12:10:11.171Z: Fusing consumer Step: 9 into Step: 8
May 21, 2021 12:10:11 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-05-21T12:10:11.209Z: Fusing consumer ParDo(TimeMonitor)2 into Step: 9
May 21, 2021 12:10:13 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-05-21T12:10:11.823Z: Executing operation Read
input/DataflowRunner.StreamingUnboundedRead.ReadWithIds+Read
input/StripIds+ParDo(TimeMonitor)+ParDo(ByteMonitor)+Step: 0+Step: 1+Step:
2+Step: 3+Step: 4+Step: 5+Step: 6+Step: 7+Step: 8+Step: 9+ParDo(TimeMonitor)2
May 21, 2021 12:10:13 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-05-21T12:10:11.903Z: Starting 5 workers in us-central1-f...
May 21, 2021 12:10:30 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-05-21T12:10:30.720Z: Your project already contains 100
Dataflow-created metric descriptors, so new user metrics of the form
custom.googleapis.com/* will not be created. However, all user metrics are also
available in the metric dataflow.googleapis.com/job/user_counter. If you rely
on the custom metrics, you can delete old / unused metric descriptors. See
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
and
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
May 21, 2021 12:11:00 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-05-21T12:10:59.989Z: Autoscaling: Raised the number of workers to 5 so
that the pipeline can catch up with its backlog and keep up with its input rate.
May 21, 2021 12:11:32 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-05-21T12:11:31.561Z: Workers have started successfully.
May 21, 2021 12:11:32 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-05-21T12:11:31.604Z: Workers have started successfully.
May 21, 2021 12:13:59 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-05-21T12:13:58.493Z: Finished operation Read
input/DataflowRunner.StreamingUnboundedRead.ReadWithIds+Read
input/StripIds+ParDo(TimeMonitor)+ParDo(ByteMonitor)+Step: 0+Step: 1+Step:
2+Step: 3+Step: 4+Step: 5+Step: 6+Step: 7+Step: 8+Step: 9+ParDo(TimeMonitor)2
May 21, 2021 12:13:59 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-05-21T12:13:58.652Z: Cleaning up.
May 21, 2021 12:13:59 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-05-21T12:13:58.784Z: Stopping worker pool...
May 21, 2021 12:14:56 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-05-21T12:14:53.947Z: Autoscaling: Reduced the number of workers to 0
based on low average worker CPU utilization, and the pipeline having sufficiently
low backlog and keeping up with input rate.
May 21, 2021 12:14:56 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-05-21T12:14:53.995Z: Worker pool stopped.
May 21, 2021 12:20:00 PM org.apache.beam.runners.dataflow.DataflowPipelineJob
logTerminalState
INFO: Job 2021-05-21_05_10_00-13007117209361637681 finished with status DONE.
May 21, 2021 12:20:01 PM org.apache.beam.sdk.testutils.metrics.MetricsReader
getCounterMetric
SEVERE: Failed to get metric totalBytes.count, from namespace pardo
Load test results for test (ID): f5a9d29d-15c7-43d0-97b5-30212b2e28e5 and
timestamp: 2021-05-21T12:09:54.468000000Z:
Metric: Value:
dataflow_runtime_sec 0.0
dataflow_total_bytes_count -1.0
Exception in thread "main" java.lang.RuntimeException: Invalid test results
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:137)
	at org.apache.beam.sdk.loadtests.ParDoLoadTest.run(ParDoLoadTest.java:53)
	at org.apache.beam.sdk.loadtests.ParDoLoadTest.main(ParDoLoadTest.java:103)
> Task :sdks:java:testing:load-tests:run FAILED
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See
https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 11m 3s
87 actionable tasks: 56 executed, 31 from cache
Publishing build scan...
Publishing build scan failed due to network error
'java.net.SocketTimeoutException: Read timed out' (2 retries remaining)...
https://gradle.com/s/euwqoacwdcm7i
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]