See
<https://ci-beam.apache.org/job/beam_PostCommit_Python38/2041/display/redirect?page=changes>
Changes:
[stranniknm] [BEAM-13423]: fix frontend failure if no examples
[daria.malkova] change return type of 2 methods
[mmack] [BEAM-13441] Use quiet delete for S3 batch deletes. In quiet mode only
[daria.malkova] Docs for validators tests
[daria.malkova] change context type
[noreply] Merge pull request #16140 from [BEAM-13377][Playground] Update CI/CD
[noreply] Merge pull request #16120 from [BEAM-13333][Playground] Save Python logs
[noreply] Merge pull request #16185 from [BEAM-13425][Playground][Bugfix] Support
[mmack] [BEAM-13445] Correctly set data limit when flushing S3 upload buffer and
[noreply] Merge pull request #16121 from [BEAM-13334][Playground] Save Go logs to
[noreply] Merge pull request #16179 from [BEAM-13344][Playground] support python
[noreply] Merge pull request #16208 from [BEAM-13442][Playground] Filepath to log
[noreply] [BEAM-13276] bump jackson-core to 2.13.0 for .test-infra (#16062)
------------------------------------------
[...truncated 19.92 MB...]
log_location:
"/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/data_plane.py:750"
thread: "Thread-14"
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running
(((((ref_AppliedPTransform_add-points-Impulse_3)+(ref_AppliedPTransform_add-points-FlatMap-lambda-at-core-py-3224-_4))+(ref_AppliedPTransform_add-points-Map-decode-_6))+(ref_AppliedPTransform_Map-get_julia_set_point_color-_7))+(ref_AppliedPTransform_x-coord-key_8))+(x
coord/Write)
INFO:root:Running
(((((ref_AppliedPTransform_add-points-Impulse_3)+(ref_AppliedPTransform_add-points-FlatMap-lambda-at-core-py-3224-_4))+(ref_AppliedPTransform_add-points-Map-decode-_6))+(ref_AppliedPTransform_Map-get_julia_set_point_color-_7))+(ref_AppliedPTransform_x-coord-key_8))+(x
coord/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (((((x
coord/Read)+(ref_AppliedPTransform_format_10))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WindowInto-WindowIntoFn-_20))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WriteBundles_21))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Pair_22))+(WriteToText/Write/WriteImpl/GroupByKey/Write)
INFO:root:Running (((((x
coord/Read)+(ref_AppliedPTransform_format_10))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WindowInto-WindowIntoFn-_20))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WriteBundles_21))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Pair_22))+(WriteToText/Write/WriteImpl/GroupByKey/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running
((WriteToText/Write/WriteImpl/GroupByKey/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Extract_24))+(ref_PCollection_PCollection_16/Write)
INFO:root:Running
((WriteToText/Write/WriteImpl/GroupByKey/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Extract_24))+(ref_PCollection_PCollection_16/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running
((ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-PreFinalize_25))+(ref_PCollection_PCollection_17/Write)
INFO:root:Running
((ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-PreFinalize_25))+(ref_PCollection_PCollection_17/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running
(ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-FinalizeWrite_26)
INFO:root:Running
(ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-FinalizeWrite_26)
INFO:root:severity: INFO
timestamp {
seconds: 1639419191
nanos: 248365640
}
message: "Starting finalize_write threads with num_shards: 1 (skipped: 0),
batches: 1, num_threads: 1"
instruction_id: "bundle_6"
transform_id: "WriteToText/Write/WriteImpl/FinalizeWrite"
log_location:
"/usr/local/lib/python3.8/site-packages/apache_beam/io/filebasedsink.py:297"
thread: "Thread-14"
INFO:root:severity: INFO
timestamp {
seconds: 1639419191
nanos: 257539033
}
message: "Renamed 1 shards in 0.01 seconds."
instruction_id: "bundle_6"
transform_id: "WriteToText/Write/WriteImpl/FinalizeWrite"
log_location:
"/usr/local/lib/python3.8/site-packages/apache_beam/io/filebasedsink.py:345"
thread: "Thread-14"
INFO:root:severity: INFO
timestamp {
seconds: 1639419191
nanos: 267343044
}
message: "No more requests from control plane"
log_location:
"/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker.py:244"
thread: "MainThread"
INFO:root:severity: INFO
timestamp {
seconds: 1639419191
nanos: 267580270
}
message: "SDK Harness waiting for in-flight requests to complete"
log_location:
"/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker.py:245"
thread: "MainThread"
INFO:root:severity: INFO
timestamp {
seconds: 1639419191
nanos: 267664432
}
message: "Closing all cached grpc data channels."
log_location:
"/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/data_plane.py:782"
thread: "MainThread"
INFO:root:severity: INFO
timestamp {
seconds: 1639419191
nanos: 267736911
}
message: "Closing all cached gRPC state handlers."
log_location:
"/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker.py:826"
thread: "MainThread"
INFO:root:severity: INFO
timestamp {
seconds: 1639419191
nanos: 269053459
}
message: "Done consuming work."
log_location:
"/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker.py:257"
thread: "MainThread"
INFO:root:severity: INFO
timestamp {
seconds: 1639419191
nanos: 269222497
}
message: "Python sdk harness exiting."
log_location:
"/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker_main.py:156"
thread: "MainThread"
INFO:apache_beam.runners.portability.local_job_service:Successfully completed
job in 6.697733640670776 seconds.
INFO:root:Successfully completed job in 6.697733640670776 seconds.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
> Task :sdks:python:test-suites:portable:py38:portableWordCountSparkRunnerBatch
INFO:apache_beam.runners.worker.worker_pool_main:Listening for workers at
localhost:45581
WARNING:root:Make sure that locally built Python SDK docker image has Python
3.8 interpreter.
INFO:root:Default Python SDK image for environment is
apache/beam_python3.8_sdk:2.36.0.dev
INFO:apache_beam.runners.portability.fn_api_runner.translations:====================
<function pack_combiners at 0x7f67fb6ccd30> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:====================
<function lift_combiners at 0x7f67fb6ccdc0> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:====================
<function sort_stages at 0x7f67fb6cd550> ====================
INFO:apache_beam.utils.subprocess_server:Starting service with ['java' '-jar'
'<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/runners/spark/2/job-server/build/libs/beam-runners-spark-job-server-2.36.0-SNAPSHOT.jar>'
'--spark-master-url' 'local[4]' '--artifacts-dir'
'/tmp/beam-temp1lshyo_9/artifacts0tkwdm4n' '--job-port' '33451'
'--artifact-port' '0' '--expansion-port' '0']
INFO:apache_beam.utils.subprocess_server:b'21/12/13 18:13:16 INFO
org.apache.beam.runners.jobsubmission.JobServerDriver: ArtifactStagingService
started on localhost:45441'
INFO:apache_beam.utils.subprocess_server:b'21/12/13 18:13:16 INFO
org.apache.beam.runners.jobsubmission.JobServerDriver: Java ExpansionService
started on localhost:42965'
INFO:apache_beam.utils.subprocess_server:b'21/12/13 18:13:16 INFO
org.apache.beam.runners.jobsubmission.JobServerDriver: JobService started on
localhost:33451'
INFO:apache_beam.utils.subprocess_server:b'21/12/13 18:13:16 INFO
org.apache.beam.runners.jobsubmission.JobServerDriver: Job server now running,
terminate with Ctrl+C'
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args:
['--parallelism=2']
INFO:apache_beam.utils.subprocess_server:b'21/12/13 18:13:17 INFO
org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Staging
artifacts for job_2e61cc82-86a4-42d4-8bd8-8b87eb788656.'
INFO:apache_beam.utils.subprocess_server:b'21/12/13 18:13:17 INFO
org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Resolving
artifacts for
job_2e61cc82-86a4-42d4-8bd8-8b87eb788656.ref_Environment_default_environment_1.'
INFO:apache_beam.utils.subprocess_server:b'21/12/13 18:13:17 INFO
org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Getting 1
artifacts for job_2e61cc82-86a4-42d4-8bd8-8b87eb788656.null.'
INFO:apache_beam.utils.subprocess_server:b'21/12/13 18:13:17 INFO
org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Artifacts
fully staged for job_2e61cc82-86a4-42d4-8bd8-8b87eb788656.'
INFO:apache_beam.utils.subprocess_server:b'21/12/13 18:13:17 INFO
org.apache.beam.runners.spark.SparkJobInvoker: Invoking job
BeamApp-jenkins-1213181317-2c215dd9_134e6dce-ea03-443c-b6e0-ab17b4a82de9'
INFO:apache_beam.utils.subprocess_server:b'21/12/13 18:13:18 INFO
org.apache.beam.runners.jobsubmission.JobInvocation: Starting job invocation
BeamApp-jenkins-1213181317-2c215dd9_134e6dce-ea03-443c-b6e0-ab17b4a82de9'
INFO:apache_beam.runners.portability.portable_runner:Environment "LOOPBACK" has
started a component necessary for the execution. Be sure to run the pipeline
using
with Pipeline() as p:
p.apply(..)
This ensures that the pipeline finishes before this program exits.
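(For reference on the LOOPBACK warning above: the pattern it asks for is the
Pipeline context manager, which does not return until the job finishes. The
following is only an illustrative sketch, not part of the log; it assumes an
installed apache_beam package, reuses the job-server port 33451 shown earlier,
and uses placeholder transforms.)

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # PortableRunner with a LOOPBACK environment runs the SDK harness inside
    # this process, so the process has to stay alive until the job completes.
    options = PipelineOptions([
        '--runner=PortableRunner',
        '--job_endpoint=localhost:33451',  # --job-port from the log above
        '--environment_type=LOOPBACK',
    ])

    # Leaving the 'with' block runs the pipeline and waits for it to finish,
    # which is what the warning recommends.
    with beam.Pipeline(options=options) as p:
        p | beam.Create(['a', 'b', 'c']) | beam.Map(print)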
INFO:apache_beam.runners.portability.portable_runner:Job state changed to
STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to
STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to
RUNNING
INFO:apache_beam.utils.subprocess_server:b'21/12/13 18:13:18 INFO
org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand
new Spark Context.'
INFO:apache_beam.utils.subprocess_server:b'21/12/13 18:13:18 WARN
org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library
for your platform... using builtin-java classes where applicable'
INFO:apache_beam.utils.subprocess_server:b'21/12/13 18:13:19 INFO
org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated
aggregators accumulator:'
INFO:apache_beam.utils.subprocess_server:b'21/12/13 18:13:19 INFO
org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics
accumulator: MetricQueryResults()'
INFO:apache_beam.utils.subprocess_server:b'21/12/13 18:13:19 INFO
org.apache.beam.runners.spark.SparkPipelineRunner: Running job
BeamApp-jenkins-1213181317-2c215dd9_134e6dce-ea03-443c-b6e0-ab17b4a82de9 on
Spark master local[4]'
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel
for localhost:35255.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with
unbounded number of workers.
INFO:apache_beam.utils.subprocess_server:b'21/12/13 18:13:20 INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam
Fn Control client connected with id 1-1'
INFO:apache_beam.utils.subprocess_server:b'21/12/13 18:13:20 INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService:
getProcessBundleDescriptor request with id 1-2'
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for
localhost:35039.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for
localhost:46483
INFO:apache_beam.utils.subprocess_server:b'21/12/13 18:13:20 INFO
org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client
connected.'
INFO:apache_beam.utils.subprocess_server:b'21/12/13 18:13:21 INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService:
getProcessBundleDescriptor request with id 1-3'
INFO:apache_beam.utils.subprocess_server:b'21/12/13 18:13:21 INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService:
getProcessBundleDescriptor request with id 1-4'
INFO:apache_beam.utils.subprocess_server:b'21/12/13 18:13:21 INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService:
getProcessBundleDescriptor request with id 1-5'
INFO:apache_beam.utils.subprocess_server:b'21/12/13 18:13:21 INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService:
getProcessBundleDescriptor request with id 1-7'
INFO:apache_beam.utils.subprocess_server:b'21/12/13 18:13:21 INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService:
getProcessBundleDescriptor request with id 1-8'
INFO:apache_beam.utils.subprocess_server:b'21/12/13 18:13:21 INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService:
getProcessBundleDescriptor request with id 1-6'
INFO:apache_beam.utils.subprocess_server:b'21/12/13 18:13:21 INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService:
getProcessBundleDescriptor request with id 1-9'
INFO:apache_beam.utils.subprocess_server:b'21/12/13 18:13:21 INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService:
getProcessBundleDescriptor request with id 1-10'
INFO:apache_beam.utils.subprocess_server:b'21/12/13 18:13:21 INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService:
getProcessBundleDescriptor request with id 1-11'
INFO:apache_beam.utils.subprocess_server:b'21/12/13 18:13:21 INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService:
getProcessBundleDescriptor request with id 1-12'
INFO:apache_beam.utils.subprocess_server:b'21/12/13 18:13:21 INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService:
getProcessBundleDescriptor request with id 1-13'
INFO:apache_beam.utils.subprocess_server:b'21/12/13 18:13:21 INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService:
getProcessBundleDescriptor request with id 1-14'
INFO:apache_beam.utils.subprocess_server:b'21/12/13 18:13:21 INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService:
getProcessBundleDescriptor request with id 1-15'
INFO:apache_beam.utils.subprocess_server:b'21/12/13 18:13:21 INFO
org.apache.beam.runners.spark.SparkPipelineRunner: Job
BeamApp-jenkins-1213181317-2c215dd9_134e6dce-ea03-443c-b6e0-ab17b4a82de9:
Pipeline translated successfully. Computing outputs'
INFO:apache_beam.utils.subprocess_server:b'21/12/13 18:13:21 INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService:
getProcessBundleDescriptor request with id 1-16'
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with
num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.01 seconds.
INFO:apache_beam.utils.subprocess_server:b'21/12/13 18:13:21 INFO
org.apache.beam.runners.spark.SparkPipelineRunner: Job
BeamApp-jenkins-1213181317-2c215dd9_134e6dce-ea03-443c-b6e0-ab17b4a82de9
finished.'
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
Exception in thread run_worker_1-1:
Traceback (most recent call last):
File "/usr/lib/python3.8/threading.py", line 932, in _bootstrap_inner
Exception in thread read_state:
Traceback (most recent call last):
File "/usr/lib/python3.8/threading.py", line 932, in _bootstrap_inner
self.run()
File "/usr/lib/python3.8/threading.py", line 870, in run
self._target(*self._args, **self._kwargs)
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",>
line 234, in run
self.run()
File "/usr/lib/python3.8/threading.py", line 870, in run
for work_request in self._control_stub.Control(get_responses()):
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/2022703442/lib/python3.8/site-packages/grpc/_channel.py",>
line 426, in __next__
return self._next()
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/2022703442/lib/python3.8/site-packages/grpc/_channel.py",>
line 826, in _next
ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data
plane.
Traceback (most recent call last):
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",>
line 634, in _read_inputs
for elements in elements_iterator:
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/2022703442/lib/python3.8/site-packages/grpc/_channel.py",>
line 426, in __next__
return self._next()
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/2022703442/lib/python3.8/site-packages/grpc/_channel.py",>
line 826, in _next
raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that
terminated with:
status = StatusCode.UNAVAILABLE
details = "Socket closed"
debug_error_string =
"{"created":"@1639419202.445304109","description":"Error received from peer
ipv4:127.0.0.1:46483","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Socket
closed","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
File "/usr/lib/python3.8/threading.py", line 932, in _bootstrap_inner
self.run()
File "/usr/lib/python3.8/threading.py", line 870, in run
self._target(*self._args, **self._kwargs)
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",>
line 957, in pull_responses
self._target(*self._args, **self._kwargs)
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",>
line 651, in <lambda>
raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that
terminated with:
status = StatusCode.UNAVAILABLE
details = "Socket closed"
debug_error_string =
"{"created":"@1639419202.445438760","description":"Error received from peer
ipv4:127.0.0.1:35255","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Socket
closed","grpc_status":14}"
>
target=lambda: self._read_inputs(elements_iterator),
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",>
line 634, in _read_inputs
for elements in elements_iterator:
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/2022703442/lib/python3.8/site-packages/grpc/_channel.py",>
line 426, in __next__
for response in responses:
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/2022703442/lib/python3.8/site-packages/grpc/_channel.py",>
line 426, in __next__
return self._next()
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/2022703442/lib/python3.8/site-packages/grpc/_channel.py",>
line 826, in _next
return self._next()
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/2022703442/lib/python3.8/site-packages/grpc/_channel.py",>
line 826, in _next
raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that
terminated with:
status = StatusCode.UNAVAILABLE
details = "Socket closed"
debug_error_string =
"{"created":"@1639419202.445427356","description":"Error received from peer
ipv4:127.0.0.1:35039","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Socket
closed","grpc_status":14}"
>
raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that
terminated with:
status = StatusCode.UNAVAILABLE
details = "Socket closed"
debug_error_string =
"{"created":"@1639419202.445304109","description":"Error received from peer
ipv4:127.0.0.1:46483","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Socket
closed","grpc_status":14}"
>
> Task :sdks:python:test-suites:portable:py38:postCommitPy38
> Task :sdks:python:test-suites:dataflow:py38:postCommitIT FAILED
FAILURE: Build failed with an exception.
* Where:
Script
'<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/test-suites/dataflow/common.gradle>'
line: 120
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See
https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 2h 10m 16s
212 actionable tasks: 151 executed, 57 from cache, 4 up-to-date
Publishing build scan...
https://gradle.com/s/z4y6yx6twyfyw
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure