See <https://builds.apache.org/job/beam_PostRelease_NightlySnapshot/45/display/redirect>

------------------------------------------
[...truncated 3.14 MB...]
Feb 13, 2018 8:44:04 PM org.apache.spark.internal.Logging$class logInfo
INFO: Fetching spark://127.0.0.1:42408/jars/chill-java-0.8.0.jar with timestamp 1518554640638
Feb 13, 2018 8:44:04 PM org.apache.spark.internal.Logging$class logInfo
INFO: Stopped Spark web UI at http://127.0.0.1:4040
Feb 13, 2018 8:44:04 PM org.apache.spark.internal.Logging$class logError
SEVERE: Exception in task 2.0 in stage 0.0 (TID 2)
java.io.IOException: Connection from /127.0.0.1:42408 closed
        at org.apache.spark.network.client.TransportResponseHandler.channelInactive(TransportResponseHandler.java:146)
        at org.apache.spark.network.server.TransportChannelHandler.channelInactive(TransportChannelHandler.java:108)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:246)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:232)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:225)
        at io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(ChannelInboundHandlerAdapter.java:75)
        at io.netty.handler.timeout.IdleStateHandler.channelInactive(IdleStateHandler.java:278)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:246)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:232)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:225)
        at io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(ChannelInboundHandlerAdapter.java:75)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:246)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:232)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:225)
        at io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(ChannelInboundHandlerAdapter.java:75)
        at org.apache.spark.network.util.TransportFrameDecoder.channelInactive(TransportFrameDecoder.java:182)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:246)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:232)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:225)
        at io.netty.channel.DefaultChannelPipeline$HeadContext.channelInactive(DefaultChannelPipeline.java:1329)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:246)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:232)
        at io.netty.channel.DefaultChannelPipeline.fireChannelInactive(DefaultChannelPipeline.java:908)
        at io.netty.channel.AbstractChannel$AbstractUnsafe$7.run(AbstractChannel.java:744)
        at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
        at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:403)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:445)
        at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
        at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
        at java.lang.Thread.run(Thread.java:748)

Feb 13, 2018 8:44:04 PM org.apache.spark.internal.Logging$class logInfo
INFO: Lost task 2.0 in stage 0.0 (TID 2) on localhost, executor driver: java.io.IOException (Connection from /127.0.0.1:42408 closed) [duplicate 2]
Feb 13, 2018 8:44:04 PM org.apache.spark.internal.Logging$class logInfo
INFO: MapOutputTrackerMasterEndpoint stopped!
Feb 13, 2018 8:44:04 PM org.apache.spark.network.client.TransportClientFactory createClient
INFO: Found inactive connection to /127.0.0.1:42408, creating a new one.
Feb 13, 2018 8:44:04 PM org.apache.spark.network.client.TransportClientFactory createClient
INFO: Successfully created connection to /127.0.0.1:42408 after 29 ms (0 ms spent in bootstraps)
Feb 13, 2018 8:44:04 PM org.apache.spark.internal.Logging$class logInfo
INFO: Fetching spark://127.0.0.1:42408/jars/chill-java-0.8.0.jar to /tmp/spark-7667752e-4151-4200-b918-edecb77a6aa4/userFiles-3b7d76b4-9e6c-4753-b027-5fe77a7949f2/fetchFileTemp2334279996979975636.tmp
Feb 13, 2018 8:44:04 PM org.apache.spark.network.server.TransportRequestHandler lambda$respond$0
SEVERE: Error sending result StreamResponse{streamId=/jars/chill-java-0.8.0.jar, byteCount=50619, body=FileSegmentManagedBuffer{file=/tmp/groovy-generated-5233139778764801241-tmpdir/.m2/repository/com/twitter/chill-java/0.8.0/chill-java-0.8.0.jar, offset=0, length=50619}} to /127.0.0.1:36798; closing connection
java.lang.AbstractMethodError
        at io.netty.util.ReferenceCountUtil.touch(ReferenceCountUtil.java:73)
        at io.netty.channel.DefaultChannelPipeline.touch(DefaultChannelPipeline.java:107)
        at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:811)
        at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:724)
        at io.netty.handler.codec.MessageToMessageEncoder.write(MessageToMessageEncoder.java:111)
        at io.netty.channel.AbstractChannelHandlerContext.invokeWrite0(AbstractChannelHandlerContext.java:739)
        at io.netty.channel.AbstractChannelHandlerContext.invokeWrite(AbstractChannelHandlerContext.java:731)
        at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:817)
        at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:724)
        at io.netty.handler.timeout.IdleStateHandler.write(IdleStateHandler.java:305)
        at io.netty.channel.AbstractChannelHandlerContext.invokeWrite0(AbstractChannelHandlerContext.java:739)
        at io.netty.channel.AbstractChannelHandlerContext.invokeWriteAndFlush(AbstractChannelHandlerContext.java:802)
        at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:815)
        at io.netty.channel.AbstractChannelHandlerContext.writeAndFlush(AbstractChannelHandlerContext.java:795)
        at io.netty.channel.AbstractChannelHandlerContext.writeAndFlush(AbstractChannelHandlerContext.java:832)
        at io.netty.channel.DefaultChannelPipeline.writeAndFlush(DefaultChannelPipeline.java:1032)
        at io.netty.channel.AbstractChannel.writeAndFlush(AbstractChannel.java:296)
        at org.apache.spark.network.server.TransportRequestHandler.respond(TransportRequestHandler.java:192)
        at org.apache.spark.network.server.TransportRequestHandler.processStreamRequest(TransportRequestHandler.java:148)
        at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:109)
        at org.apache.spark.network.server.TransportChannelHandler.channelRead(TransportChannelHandler.java:118)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:363)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:349)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:341)
        at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:287)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:363)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:349)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:341)
        at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:363)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:349)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:341)
        at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:85)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:363)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:349)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:341)
        at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1334)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:363)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:349)
        at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:926)
        at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:129)
        at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:642)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:565)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:479)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:441)
        at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
        at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
        at java.lang.Thread.run(Thread.java:748)

Feb 13, 2018 8:44:04 PM org.apache.spark.network.client.TransportResponseHandler channelInactive
SEVERE: Still have 1 requests outstanding when connection from /127.0.0.1:42408 is closed
Feb 13, 2018 8:44:04 PM org.apache.spark.internal.Logging$class logError
SEVERE: Exception in task 1.0 in stage 0.0 (TID 1)
java.io.IOException: Connection from /127.0.0.1:42408 closed
        at org.apache.spark.network.client.TransportResponseHandler.channelInactive(TransportResponseHandler.java:146)
        at org.apache.spark.network.server.TransportChannelHandler.channelInactive(TransportChannelHandler.java:108)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:246)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:232)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:225)
        at io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(ChannelInboundHandlerAdapter.java:75)
        at io.netty.handler.timeout.IdleStateHandler.channelInactive(IdleStateHandler.java:278)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:246)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:232)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:225)
        at io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(ChannelInboundHandlerAdapter.java:75)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:246)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:232)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:225)
        at io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(ChannelInboundHandlerAdapter.java:75)
        at org.apache.spark.network.util.TransportFrameDecoder.channelInactive(TransportFrameDecoder.java:182)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:246)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:232)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:225)
        at io.netty.channel.DefaultChannelPipeline$HeadContext.channelInactive(DefaultChannelPipeline.java:1329)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:246)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:232)
        at io.netty.channel.DefaultChannelPipeline.fireChannelInactive(DefaultChannelPipeline.java:908)
        at io.netty.channel.AbstractChannel$AbstractUnsafe$7.run(AbstractChannel.java:744)
        at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
        at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:403)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:445)
        at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
        at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
        at java.lang.Thread.run(Thread.java:748)

Feb 13, 2018 8:44:04 PM org.apache.spark.internal.Logging$class logInfo
INFO: MemoryStore cleared
Feb 13, 2018 8:44:04 PM org.apache.spark.internal.Logging$class logInfo
INFO: BlockManager stopped
Feb 13, 2018 8:44:04 PM org.apache.spark.internal.Logging$class logInfo
INFO: BlockManagerMaster stopped
Feb 13, 2018 8:44:04 PM org.apache.spark.internal.Logging$class logInfo
INFO: OutputCommitCoordinator stopped!
Feb 13, 2018 8:44:04 PM org.apache.spark.internal.Logging$class logInfo
INFO: Successfully stopped SparkContext
[WARNING] 
org.apache.beam.sdk.Pipeline$PipelineExecutionException: java.io.IOException: Connection from /127.0.0.1:42408 closed
    at org.apache.beam.runners.spark.SparkPipelineResult.beamExceptionFrom (SparkPipelineResult.java:68)
    at org.apache.beam.runners.spark.SparkPipelineResult.waitUntilFinish (SparkPipelineResult.java:99)
    at org.apache.beam.runners.spark.SparkPipelineResult.waitUntilFinish (SparkPipelineResult.java:87)
    at org.apache.beam.examples.WordCount.main (WordCount.java:187)
    at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke (Method.java:498)
    at org.codehaus.mojo.exec.ExecJavaMojo$1.run (ExecJavaMojo.java:282)
    at java.lang.Thread.run (Thread.java:748)
Caused by: java.io.IOException: Connection from /127.0.0.1:42408 closed
    at org.apache.spark.network.client.TransportResponseHandler.channelInactive (TransportResponseHandler.java:146)
    at org.apache.spark.network.server.TransportChannelHandler.channelInactive (TransportChannelHandler.java:108)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive (AbstractChannelHandlerContext.java:246)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive (AbstractChannelHandlerContext.java:232)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive (AbstractChannelHandlerContext.java:225)
    at io.netty.channel.ChannelInboundHandlerAdapter.channelInactive (ChannelInboundHandlerAdapter.java:75)
    at io.netty.handler.timeout.IdleStateHandler.channelInactive (IdleStateHandler.java:278)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive (AbstractChannelHandlerContext.java:246)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive (AbstractChannelHandlerContext.java:232)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive (AbstractChannelHandlerContext.java:225)
    at io.netty.channel.ChannelInboundHandlerAdapter.channelInactive (ChannelInboundHandlerAdapter.java:75)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive (AbstractChannelHandlerContext.java:246)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive (AbstractChannelHandlerContext.java:232)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive (AbstractChannelHandlerContext.java:225)
    at io.netty.channel.ChannelInboundHandlerAdapter.channelInactive (ChannelInboundHandlerAdapter.java:75)
    at org.apache.spark.network.util.TransportFrameDecoder.channelInactive (TransportFrameDecoder.java:182)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive (AbstractChannelHandlerContext.java:246)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive (AbstractChannelHandlerContext.java:232)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive (AbstractChannelHandlerContext.java:225)
    at io.netty.channel.DefaultChannelPipeline$HeadContext.channelInactive (DefaultChannelPipeline.java:1329)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive (AbstractChannelHandlerContext.java:246)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive (AbstractChannelHandlerContext.java:232)
    at io.netty.channel.DefaultChannelPipeline.fireChannelInactive (DefaultChannelPipeline.java:908)
    at io.netty.channel.AbstractChannel$AbstractUnsafe$7.run (AbstractChannel.java:744)
    at io.netty.util.concurrent.AbstractEventExecutor.safeExecute (AbstractEventExecutor.java:163)
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks (SingleThreadEventExecutor.java:403)
    at io.netty.channel.nio.NioEventLoop.run (NioEventLoop.java:445)
    at io.netty.util.concurrent.SingleThreadEventExecutor$5.run (SingleThreadEventExecutor.java:858)
    at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run (DefaultThreadFactory.java:144)
    at java.lang.Thread.run (Thread.java:748)
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:59 min
[INFO] Finished at: 2018-02-13T20:44:04Z
[INFO] Final Memory: 141M/931M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.6.0:java (default-cli) on project word-count-beam: An exception occured while executing the Java class. java.io.IOException: Connection from /127.0.0.1:42408 closed -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] Failed command: 

:runners:spark:runQuickstartJavaSpark FAILED
Feb 13, 2018 8:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-02-13T20:45:57.920Z: (1f93fa8c5ba4f5de): Autoscaling: Resized worker pool from 1 to 0.
Feb 13, 2018 8:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-02-13T20:45:57.939Z: (1f93fa8c5ba4f960): Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
Feb 13, 2018 8:46:06 PM org.apache.beam.runners.dataflow.DataflowPipelineJob waitUntilFinish
INFO: Job 2018-02-13_12_41_46-14768684759396237926 finished with status DONE.
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 05:38 min
[INFO] Finished at: 2018-02-13T20:46:06Z
[INFO] Final Memory: 55M/622M
[INFO] ------------------------------------------------------------------------
gsutil cat gs://temp-storage-for-release-validation-tests/quickstart/count* | grep Montague:
Montague: 47
Verified Montague: 47
gsutil rm gs://temp-storage-for-release-validation-tests/quickstart/count*
Removing gs://temp-storage-for-release-validation-tests/quickstart/counts-00000-of-00003...
/ [1 objects]
Removing gs://temp-storage-for-release-validation-tests/quickstart/counts-00001-of-00003...
/ [2 objects]
Removing gs://temp-storage-for-release-validation-tests/quickstart/counts-00002-of-00003...
/ [3 objects]
Operation completed over 3 objects.
 
[SUCCESS]

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':runners:spark:runQuickstartJavaSpark'.
> Process 'command '/usr/local/asfpackages/java/jdk1.8.0_152/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 6m 37s
6 actionable tasks: 6 executed
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Not sending mail to unregistered user kirpic...@google.com
Not sending mail to unregistered user xuming...@users.noreply.github.com
