See
<https://builds.apache.org/job/beam_PostRelease_NightlySnapshot/160/display/redirect?page=changes>
Changes:
[grzegorz.kolakowski] [BEAM-3800] Set uids on Flink operators
[github] Updated to Ubuntu 16 version of python 2
[echauchot] [BEAM-3892] Make MetricQueryResults and related classes more
[iemejia] [BEAM-3931] Remove commons-text dependency from Spark runner
[ehudm] Fix test_pre_finalize_error to test exceptions.
[aljoscha.krettek] [BEAM-622] Add checkpointing tests for DoFnOperator and
[aljoscha.krettek] [BEAM-3087] Make reader state update and element emission atomic
[aljoscha.krettek] [BEAM-2393] Make BoundedSource fault-tolerant
------------------------------------------
[...truncated 3.03 MB...]
INFO: Starting task 7.0 in stage 2.0 (TID 15, localhost, executor driver,
partition 7, PROCESS_LOCAL, 4730 bytes)
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 2.0 (TID 10) in 93 ms on localhost (executor
driver) (2/8)
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 2.0 (TID 8) in 101 ms on localhost (executor
driver) (3/8)
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 2.0 (TID 11) in 93 ms on localhost (executor
driver) (4/8)
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 7.0 in stage 2.0 (TID 15)
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 0 non-empty blocks out of 4 blocks
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 0 ms
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 0 non-empty blocks out of 4 blocks
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 0 ms
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 0 non-empty blocks out of 4 blocks
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 0 non-empty blocks out of 4 blocks
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 0 ms
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block rdd_41_4 stored as values in memory (estimated size 16.0 B, free
1825.1 MB)
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block rdd_41_5 stored as values in memory (estimated size 16.0 B, free
1825.1 MB)
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Added rdd_41_4 in memory on 127.0.0.1:35195 (size: 16.0 B, free: 1825.4
MB)
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block rdd_41_7 stored as values in memory (estimated size 16.0 B, free
1825.1 MB)
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Added rdd_41_5 in memory on 127.0.0.1:35195 (size: 16.0 B, free: 1825.4
MB)
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Added rdd_41_7 in memory on 127.0.0.1:35195 (size: 16.0 B, free: 1825.4
MB)
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block rdd_41_6 stored as values in memory (estimated size 16.0 B, free
1825.1 MB)
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 5.0 in stage 2.0 (TID 13). 11942 bytes result sent to driver
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Added rdd_41_6 in memory on 127.0.0.1:35195 (size: 16.0 B, free: 1825.4
MB)
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 5.0 in stage 2.0 (TID 13) in 80 ms on localhost (executor
driver) (5/8)
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 6.0 in stage 2.0 (TID 14). 11942 bytes result sent to driver
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 4.0 in stage 2.0 (TID 12). 11942 bytes result sent to driver
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 6.0 in stage 2.0 (TID 14) in 80 ms on localhost (executor
driver) (6/8)
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 7.0 in stage 2.0 (TID 15). 11942 bytes result sent to driver
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 4.0 in stage 2.0 (TID 12) in 98 ms on localhost (executor
driver) (7/8)
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 7.0 in stage 2.0 (TID 15) in 80 ms on localhost (executor
driver) (8/8)
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Removed TaskSet 2.0, whose tasks have all completed, from pool
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: ResultStage 2 (collect at BoundedDataset.java:87) finished in 0.179 s
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Job 0 finished: collect at BoundedDataset.java:87, took 3.604269 s
Mar 27, 2018 11:04:18 AM org.apache.beam.runners.spark.SparkRunner$Evaluator
enterCompositeTransform
INFO: Entering directly-translatable composite transform:
'WriteCounts/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Create.Values'
Mar 27, 2018 11:04:18 AM org.apache.beam.runners.spark.SparkRunner$Evaluator
doVisitTransform
INFO: Evaluating Create.Values
Mar 27, 2018 11:04:18 AM org.apache.beam.runners.spark.SparkRunner$Evaluator
doVisitTransform
INFO: Evaluating org.apache.beam.sdk.transforms.Reify$ReifyView$1@1850defc
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_3 stored as values in memory (estimated size 672.0 B,
free 1825.1 MB)
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_3_piece0 stored as bytes in memory (estimated size 361.0
B, free 1825.1 MB)
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Added broadcast_3_piece0 in memory on 127.0.0.1:35195 (size: 361.0 B,
free: 1825.4 MB)
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Created broadcast 3 from broadcast at SideInputBroadcast.java:59
Mar 27, 2018 11:04:18 AM org.apache.beam.runners.spark.SparkRunner$Evaluator
doVisitTransform
INFO: Evaluating org.apache.beam.sdk.transforms.MapElements$1@4ea286f6
Mar 27, 2018 11:04:18 AM org.apache.beam.runners.spark.SparkRunner$Evaluator
doVisitTransform
INFO: Evaluating
org.apache.beam.sdk.io.WriteFiles$FinalizeTempFileBundles$FinalizeFn@4110ba55
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting job: foreach at BoundedDataset.java:117
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Got job 1 (foreach at BoundedDataset.java:117) with 4 output partitions
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Final stage: ResultStage 3 (foreach at BoundedDataset.java:117)
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Parents of final stage: List()
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Missing parents: List()
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Submitting ResultStage 3
(WriteCounts/WriteFiles/FinalizeTempFileBundles/Finalize/ParMultiDo(Finalize).output
MapPartitionsRDD[53] at values at TransformTranslator.java:400), which has no
missing parents
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_4 stored as values in memory (estimated size 84.1 KB,
free 1825.0 MB)
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_4_piece0 stored as bytes in memory (estimated size 22.4
KB, free 1825.0 MB)
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Added broadcast_4_piece0 in memory on 127.0.0.1:35195 (size: 22.4 KB,
free: 1825.3 MB)
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Created broadcast 4 from broadcast at DAGScheduler.scala:1006
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Submitting 4 missing tasks from ResultStage 3
(WriteCounts/WriteFiles/FinalizeTempFileBundles/Finalize/ParMultiDo(Finalize).output
MapPartitionsRDD[53] at values at TransformTranslator.java:400) (first 15
tasks are for partitions Vector(0, 1, 2, 3))
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Adding task set 3.0 with 4 tasks
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 0.0 in stage 3.0 (TID 16, localhost, executor driver,
partition 0, PROCESS_LOCAL, 4827 bytes)
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 1.0 in stage 3.0 (TID 17, localhost, executor driver,
partition 1, PROCESS_LOCAL, 4827 bytes)
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 2.0 in stage 3.0 (TID 18, localhost, executor driver,
partition 2, PROCESS_LOCAL, 4827 bytes)
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 3.0 in stage 3.0 (TID 19, localhost, executor driver,
partition 3, PROCESS_LOCAL, 4837 bytes)
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 0.0 in stage 3.0 (TID 16)
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 1.0 in stage 3.0 (TID 17)
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 2.0 in stage 3.0 (TID 18)
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 3.0 in stage 3.0 (TID 19)
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 1.0 in stage 3.0 (TID 17). 12184 bytes result sent to driver
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 3.0 (TID 16). 12141 bytes result sent to driver
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 3.0 (TID 16) in 58 ms on localhost (executor
driver) (1/4)
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 1.0 in stage 3.0 (TID 17) in 57 ms on localhost (executor
driver) (2/4)
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 3.0 (TID 18). 12141 bytes result sent to driver
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 3.0 (TID 18) in 60 ms on localhost (executor
driver) (3/4)
Mar 27, 2018 11:04:18 AM
org.apache.beam.sdk.io.WriteFiles$FinalizeTempFileBundles$FinalizeFn process
INFO: Finalizing 4 file results
Mar 27, 2018 11:04:18 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation
createMissingEmptyShards
INFO: Finalizing for destination null num shards 4.
Mar 27, 2018 11:04:18 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation
moveToOutputFiles
INFO: Will copy temporary file
FileResult{tempFilename=/tmp/groovy-generated-5648081637490606715-tmpdir/word-count-beam/.temp-beam-2018-03-27_11-04-11-0/e1669ace-eea2-4499-aa4b-6792e28252cd,
shard=0,
window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@5f0c6fdc,
paneInfo=PaneInfo.NO_FIRING} to final location
/tmp/groovy-generated-5648081637490606715-tmpdir/word-count-beam/counts-00000-of-00004
Mar 27, 2018 11:04:18 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation
moveToOutputFiles
INFO: Will copy temporary file
FileResult{tempFilename=/tmp/groovy-generated-5648081637490606715-tmpdir/word-count-beam/.temp-beam-2018-03-27_11-04-11-0/579b6028-42d8-4c84-85bc-08aef5652590,
shard=1,
window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@5f0c6fdc,
paneInfo=PaneInfo.NO_FIRING} to final location
/tmp/groovy-generated-5648081637490606715-tmpdir/word-count-beam/counts-00001-of-00004
Mar 27, 2018 11:04:18 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation
moveToOutputFiles
INFO: Will copy temporary file
FileResult{tempFilename=/tmp/groovy-generated-5648081637490606715-tmpdir/word-count-beam/.temp-beam-2018-03-27_11-04-11-0/e0694245-7245-4a81-9e05-2fc37a5ddbb3,
shard=2,
window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@5f0c6fdc,
paneInfo=PaneInfo.NO_FIRING} to final location
/tmp/groovy-generated-5648081637490606715-tmpdir/word-count-beam/counts-00002-of-00004
Mar 27, 2018 11:04:18 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation
moveToOutputFiles
INFO: Will copy temporary file
FileResult{tempFilename=/tmp/groovy-generated-5648081637490606715-tmpdir/word-count-beam/.temp-beam-2018-03-27_11-04-11-0/3a61bf97-474f-4b94-8fa5-398416b6027b,
shard=3,
window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@5f0c6fdc,
paneInfo=PaneInfo.NO_FIRING} to final location
/tmp/groovy-generated-5648081637490606715-tmpdir/word-count-beam/counts-00003-of-00004
Mar 27, 2018 11:04:18 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation
removeTemporaryFiles
INFO: Will remove known temporary file
/tmp/groovy-generated-5648081637490606715-tmpdir/word-count-beam/.temp-beam-2018-03-27_11-04-11-0/3a61bf97-474f-4b94-8fa5-398416b6027b
Mar 27, 2018 11:04:18 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation
removeTemporaryFiles
INFO: Will remove known temporary file
/tmp/groovy-generated-5648081637490606715-tmpdir/word-count-beam/.temp-beam-2018-03-27_11-04-11-0/e1669ace-eea2-4499-aa4b-6792e28252cd
Mar 27, 2018 11:04:18 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation
removeTemporaryFiles
INFO: Will remove known temporary file
/tmp/groovy-generated-5648081637490606715-tmpdir/word-count-beam/.temp-beam-2018-03-27_11-04-11-0/e0694245-7245-4a81-9e05-2fc37a5ddbb3
Mar 27, 2018 11:04:18 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation
removeTemporaryFiles
INFO: Will remove known temporary file
/tmp/groovy-generated-5648081637490606715-tmpdir/word-count-beam/.temp-beam-2018-03-27_11-04-11-0/579b6028-42d8-4c84-85bc-08aef5652590
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 3.0 (TID 19). 15889 bytes result sent to driver
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 3.0 (TID 19) in 132 ms on localhost (executor
driver) (4/4)
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Removed TaskSet 3.0, whose tasks have all completed, from pool
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: ResultStage 3 (foreach at BoundedDataset.java:117) finished in 0.140 s
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Job 1 finished: foreach at BoundedDataset.java:117, took 0.166562 s
Mar 27, 2018 11:04:18 AM org.apache.beam.runners.spark.SparkRunner lambda$run$1
INFO: Batch pipeline execution complete.
Mar 27, 2018 11:04:18 AM org.spark_project.jetty.server.AbstractConnector doStop
INFO: Stopped Spark@14688bb0{HTTP/1.1,[http/1.1]}{127.0.0.1:4040}
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Stopped Spark web UI at http://127.0.0.1:4040
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: MapOutputTrackerMasterEndpoint stopped!
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: MemoryStore cleared
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: BlockManager stopped
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: BlockManagerMaster stopped
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: OutputCommitCoordinator stopped!
Mar 27, 2018 11:04:18 AM org.apache.spark.internal.Logging$class logInfo
INFO: Successfully stopped SparkContext
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:45 min
[INFO] Finished at: 2018-03-27T11:04:18Z
[INFO] Final Memory: 98M/1387M
[INFO] ------------------------------------------------------------------------
grep Foundation counts*
counts-00000-of-00004:Foundation: 1
Verified Foundation: 1
[SUCCESS]
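The `grep Foundation counts*` check above is the quickstart's smoke test: the WordCount pipeline writes its output as sharded `counts-0000N-of-0000M` files, and the release script greps the shards for a known word and reports the hit count. A minimal sketch of that verification step (the shard names follow the log above; the file contents here are invented for illustration):

```shell
# Sketch of the post-release verification step. Shard layout matches the log
# above ("counts-00000-of-00004", ...); the word counts are made up.
set -eu
tmp=$(mktemp -d)
printf 'Foundation: 1\n' > "$tmp/counts-00000-of-00004"
printf 'Montague: 47\n'  > "$tmp/counts-00001-of-00004"

# Count lines matching the expected word across every shard, as the
# "grep Foundation counts*" step does.
hits=$(cat "$tmp"/counts-* | grep -c 'Foundation:')
echo "Verified Foundation: $hits"
```

The same pattern reappears below for the Dataflow run, with `gsutil cat gs://.../count* | grep Montague:` standing in for the local `grep` — which is why that check fails outright when the Dataflow job never writes its output.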
Mar 27, 2018 12:07:05 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2018-03-27T12:07:04.932Z: Workflow failed. Causes: The Dataflow appears
to be stuck. You can get help with Cloud Dataflow at
https://cloud.google.com/dataflow/support.
Mar 27, 2018 12:07:05 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-03-27T12:07:05.056Z: Cancel request is committed for workflow job:
2018-03-27_04_02_24-5362352947987062113.
Mar 27, 2018 12:07:05 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-03-27T12:07:05.177Z: Cleaning up.
Mar 27, 2018 12:07:05 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-03-27T12:07:05.244Z: Stopping worker pool...
Mar 27, 2018 12:08:24 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-03-27T12:08:23.410Z: Autoscaling: Reduced the number of workers to 0
based on the rate of progress in the currently running step(s).
Mar 27, 2018 12:08:31 PM org.apache.beam.runners.dataflow.DataflowPipelineJob
waitUntilFinish
INFO: Job 2018-03-27_04_02_24-5362352947987062113 failed with status FAILED.
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:07 h
[INFO] Finished at: 2018-03-27T12:08:31Z
[INFO] Final Memory: 37M/295M
[INFO] ------------------------------------------------------------------------
gsutil cat gs://temp-storage-for-release-validation-tests/quickstart/count* |
grep Montague:
CommandException: No URLs matched:
gs://temp-storage-for-release-validation-tests/quickstart/count*
CommandException: No URLs matched:
gs://temp-storage-for-release-validation-tests/quickstart/count*
[ERROR] Failed command
:runners:google-cloud-dataflow-java:runQuickstartJavaDataflow FAILED
FAILURE: Build completed with 2 failures.
1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:flink:runQuickstartJavaFlinkLocal'.
> Process 'command '/usr/local/asfpackages/java/jdk1.8.0_152/bin/java'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
==============================================================================
2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task
':runners:google-cloud-dataflow-java:runQuickstartJavaDataflow'.
> Process 'command '/usr/local/asfpackages/java/jdk1.8.0_152/bin/java'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
==============================================================================
* Get more help at https://help.gradle.org
BUILD FAILED in 1h 8m 16s
6 actionable tasks: 6 executed
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Not sending mail to unregistered user [email protected]
Not sending mail to unregistered user [email protected]
Not sending mail to unregistered user [email protected]
Not sending mail to unregistered user [email protected]