See
<https://builds.apache.org/job/beam_PostRelease_NightlySnapshot/30/display/redirect>
------------------------------------------
[...truncated 3.12 MB...]
INFO: Found inactive connection to /127.0.0.1:41080, creating a new one.
Feb 10, 2018 12:27:12 AM org.apache.spark.network.client.TransportClientFactory createClient
INFO: Successfully created connection to /127.0.0.1:41080 after 1 ms (0 ms spent in bootstraps)
Feb 10, 2018 12:27:12 AM org.apache.spark.internal.Logging$class logInfo
INFO: Fetching spark://127.0.0.1:41080/jars/beam-sdks-java-io-google-cloud-platform-2.3.0-SNAPSHOT.jar to /tmp/spark-3ca36f4e-a26b-4eaa-8188-420f6cce4acd/userFiles-1450c31b-74f5-4bd6-8b80-f2548ef21f51/fetchFileTemp8917629938307745649.tmp
Feb 10, 2018 12:27:12 AM org.apache.spark.internal.Logging$class logInfo
INFO: MapOutputTrackerMasterEndpoint stopped!
Feb 10, 2018 12:27:12 AM org.apache.spark.network.server.TransportRequestHandler lambda$respond$0
SEVERE: Error sending result StreamResponse{streamId=/jars/beam-sdks-java-io-google-cloud-platform-2.3.0-SNAPSHOT.jar, byteCount=526071, body=FileSegmentManagedBuffer{file=/tmp/groovy-generated-3636326011548555434-tmpdir/.m2/repository/org/apache/beam/beam-sdks-java-io-google-cloud-platform/2.3.0-SNAPSHOT/beam-sdks-java-io-google-cloud-platform-2.3.0-SNAPSHOT.jar, offset=0, length=526071}} to /127.0.0.1:58656; closing connection
java.lang.AbstractMethodError
    at io.netty.util.ReferenceCountUtil.touch(ReferenceCountUtil.java:73)
    at io.netty.channel.DefaultChannelPipeline.touch(DefaultChannelPipeline.java:107)
    at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:811)
    at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:724)
    at io.netty.handler.codec.MessageToMessageEncoder.write(MessageToMessageEncoder.java:111)
    at io.netty.channel.AbstractChannelHandlerContext.invokeWrite0(AbstractChannelHandlerContext.java:739)
    at io.netty.channel.AbstractChannelHandlerContext.invokeWrite(AbstractChannelHandlerContext.java:731)
    at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:817)
    at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:724)
    at io.netty.handler.timeout.IdleStateHandler.write(IdleStateHandler.java:305)
    at io.netty.channel.AbstractChannelHandlerContext.invokeWrite0(AbstractChannelHandlerContext.java:739)
    at io.netty.channel.AbstractChannelHandlerContext.invokeWriteAndFlush(AbstractChannelHandlerContext.java:802)
    at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:815)
    at io.netty.channel.AbstractChannelHandlerContext.writeAndFlush(AbstractChannelHandlerContext.java:795)
    at io.netty.channel.AbstractChannelHandlerContext.writeAndFlush(AbstractChannelHandlerContext.java:832)
    at io.netty.channel.DefaultChannelPipeline.writeAndFlush(DefaultChannelPipeline.java:1032)
    at io.netty.channel.AbstractChannel.writeAndFlush(AbstractChannel.java:296)
    at org.apache.spark.network.server.TransportRequestHandler.respond(TransportRequestHandler.java:192)
    at org.apache.spark.network.server.TransportRequestHandler.processStreamRequest(TransportRequestHandler.java:148)
    at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:109)
    at org.apache.spark.network.server.TransportChannelHandler.channelRead(TransportChannelHandler.java:118)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:363)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:349)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:341)
    at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:287)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:363)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:349)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:341)
    at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:363)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:349)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:341)
    at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:85)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:363)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:349)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:341)
    at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1334)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:363)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:349)
    at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:926)
    at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:129)
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:642)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:565)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:479)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:441)
    at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
    at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
    at java.lang.Thread.run(Thread.java:748)
Feb 10, 2018 12:27:12 AM org.apache.spark.network.client.TransportResponseHandler channelInactive
SEVERE: Still have 1 requests outstanding when connection from /127.0.0.1:41080 is closed
Feb 10, 2018 12:27:12 AM org.apache.spark.internal.Logging$class logInfo
INFO: Fetching spark://127.0.0.1:41080/jars/beam-sdks-java-io-google-cloud-platform-2.3.0-SNAPSHOT.jar with timestamp 1518222429814
Feb 10, 2018 12:27:12 AM org.apache.spark.internal.Logging$class logError
SEVERE: Exception in task 2.0 in stage 0.0 (TID 2)
java.io.IOException: Connection from /127.0.0.1:41080 closed
    at org.apache.spark.network.client.TransportResponseHandler.channelInactive(TransportResponseHandler.java:146)
    at org.apache.spark.network.server.TransportChannelHandler.channelInactive(TransportChannelHandler.java:108)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:246)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:232)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:225)
    at io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(ChannelInboundHandlerAdapter.java:75)
    at io.netty.handler.timeout.IdleStateHandler.channelInactive(IdleStateHandler.java:278)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:246)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:232)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:225)
    at io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(ChannelInboundHandlerAdapter.java:75)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:246)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:232)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:225)
    at io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(ChannelInboundHandlerAdapter.java:75)
    at org.apache.spark.network.util.TransportFrameDecoder.channelInactive(TransportFrameDecoder.java:182)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:246)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:232)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:225)
    at io.netty.channel.DefaultChannelPipeline$HeadContext.channelInactive(DefaultChannelPipeline.java:1329)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:246)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:232)
    at io.netty.channel.DefaultChannelPipeline.fireChannelInactive(DefaultChannelPipeline.java:908)
    at io.netty.channel.AbstractChannel$AbstractUnsafe$7.run(AbstractChannel.java:744)
    at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:403)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:445)
    at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
    at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
    at java.lang.Thread.run(Thread.java:748)
Feb 10, 2018 12:27:12 AM org.apache.spark.internal.Logging$class logInfo
INFO: MemoryStore cleared
Feb 10, 2018 12:27:12 AM org.apache.spark.internal.Logging$class logInfo
INFO: BlockManager stopped
Feb 10, 2018 12:27:12 AM org.apache.spark.internal.Logging$class logInfo
INFO: BlockManagerMaster stopped
Feb 10, 2018 12:27:12 AM org.apache.spark.internal.Logging$class logInfo
INFO: OutputCommitCoordinator stopped!
Feb 10, 2018 12:27:12 AM org.apache.spark.internal.Logging$class logInfo
INFO: Successfully stopped SparkContext
[WARNING]
org.apache.beam.sdk.Pipeline$PipelineExecutionException: java.io.IOException: Connection from /127.0.0.1:41080 closed
    at org.apache.beam.runners.spark.SparkPipelineResult.beamExceptionFrom (SparkPipelineResult.java:68)
    at org.apache.beam.runners.spark.SparkPipelineResult.waitUntilFinish (SparkPipelineResult.java:99)
    at org.apache.beam.runners.spark.SparkPipelineResult.waitUntilFinish (SparkPipelineResult.java:87)
    at org.apache.beam.examples.WordCount.main (WordCount.java:187)
    at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke (Method.java:498)
    at org.codehaus.mojo.exec.ExecJavaMojo$1.run (ExecJavaMojo.java:282)
    at java.lang.Thread.run (Thread.java:748)
Caused by: java.io.IOException: Connection from /127.0.0.1:41080 closed
    at org.apache.spark.network.client.TransportResponseHandler.channelInactive (TransportResponseHandler.java:146)
    at org.apache.spark.network.server.TransportChannelHandler.channelInactive (TransportChannelHandler.java:108)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive (AbstractChannelHandlerContext.java:246)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive (AbstractChannelHandlerContext.java:232)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive (AbstractChannelHandlerContext.java:225)
    at io.netty.channel.ChannelInboundHandlerAdapter.channelInactive (ChannelInboundHandlerAdapter.java:75)
    at io.netty.handler.timeout.IdleStateHandler.channelInactive (IdleStateHandler.java:278)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive (AbstractChannelHandlerContext.java:246)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive (AbstractChannelHandlerContext.java:232)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive (AbstractChannelHandlerContext.java:225)
    at io.netty.channel.ChannelInboundHandlerAdapter.channelInactive (ChannelInboundHandlerAdapter.java:75)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive (AbstractChannelHandlerContext.java:246)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive (AbstractChannelHandlerContext.java:232)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive (AbstractChannelHandlerContext.java:225)
    at io.netty.channel.ChannelInboundHandlerAdapter.channelInactive (ChannelInboundHandlerAdapter.java:75)
    at org.apache.spark.network.util.TransportFrameDecoder.channelInactive (TransportFrameDecoder.java:182)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive (AbstractChannelHandlerContext.java:246)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive (AbstractChannelHandlerContext.java:232)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive (AbstractChannelHandlerContext.java:225)
    at io.netty.channel.DefaultChannelPipeline$HeadContext.channelInactive (DefaultChannelPipeline.java:1329)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive (AbstractChannelHandlerContext.java:246)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive (AbstractChannelHandlerContext.java:232)
    at io.netty.channel.DefaultChannelPipeline.fireChannelInactive (DefaultChannelPipeline.java:908)
    at io.netty.channel.AbstractChannel$AbstractUnsafe$7.run (AbstractChannel.java:744)
    at io.netty.util.concurrent.AbstractEventExecutor.safeExecute (AbstractEventExecutor.java:163)
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks (SingleThreadEventExecutor.java:403)
    at io.netty.channel.nio.NioEventLoop.run (NioEventLoop.java:445)
    at io.netty.util.concurrent.SingleThreadEventExecutor$5.run (SingleThreadEventExecutor.java:858)
    at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run (DefaultThreadFactory.java:144)
    at java.lang.Thread.run (Thread.java:748)
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:54 min
[INFO] Finished at: 2018-02-10T00:27:12Z
[INFO] Final Memory: 126M/788M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.6.0:java (default-cli) on project word-count-beam: An exception occurred while executing the Java class. java.io.IOException: Connection from /127.0.0.1:41080 closed -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
Feb 10, 2018 12:27:13 AM org.apache.spark.internal.Logging$class logError
SEVERE: Exception in task 0.0 in stage 0.0 (TID 0)
java.lang.IllegalStateException: Cannot retrieve files with 'spark' scheme without an active SparkEnv.
    at org.apache.spark.util.Utils$.doFetchFile(Utils.scala:651)
    at org.apache.spark.util.Utils$.fetchFile(Utils.scala:480)
    at org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$5.apply(Executor.scala:708)
    at org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$5.apply(Executor.scala:700)
    at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:733)
    at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
    at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
    at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
    at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
    at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
    at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:732)
    at org.apache.spark.executor.Executor.org$apache$spark$executor$Executor$$updateDependencies(Executor.scala:700)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:311)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
[ERROR] Failed command
:runners:spark:runQuickstartJavaSpark FAILED
Feb 10, 2018 12:27:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-02-10T00:27:32.040Z: (43e9af0e0d619693): Workers have started successfully.
Feb 10, 2018 12:29:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-02-10T00:29:03.310Z: (e0a72014f4a63c2e): Executing operation WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Close
Feb 10, 2018 12:29:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-02-10T00:29:03.378Z: (e0a72014f4a63ba6): Executing operation WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Create
Feb 10, 2018 12:29:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-02-10T00:29:03.521Z: (e0a72014f4a634eb): Executing operation WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Read+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues/Extract+MapElements/Map+WriteCounts/WriteFiles/RewindowIntoGlobal/Window.Assign+WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles+WriteCounts/WriteFiles/GatherTempFileResults/View.AsList/ParDo(ToIsmRecordForGlobalWindow)+WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Reify+WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Write
Feb 10, 2018 12:29:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-02-10T00:29:16.896Z: (59ccf7193ac59c8b): Executing operation WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Close
Feb 10, 2018 12:29:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-02-10T00:29:16.961Z: (e0a72014f4a63775): Executing operation WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Read+WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/GroupByWindow+WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten+WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum+WriteCounts/WriteFiles/GatherTempFileResults/View.AsList/ParDo(ToIsmRecordForGlobalWindow)
Feb 10, 2018 12:29:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-02-10T00:29:19.627Z: (e0a72014f4a63e21): Executing operation s12-u31
Feb 10, 2018 12:29:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-02-10T00:29:19.856Z: (59ccf7193ac59c98): Executing operation WriteCounts/WriteFiles/GatherTempFileResults/View.AsList/CreateDataflowView
Feb 10, 2018 12:29:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-02-10T00:29:20.021Z: (472c5456b7c1d9e3): Executing operation WriteCounts/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Create.Values/Read(CreateSource)+WriteCounts/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Reify.ReifyView/ParDo(Anonymous)+WriteCounts/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Values/Values/Map+WriteCounts/WriteFiles/FinalizeTempFileBundles/Finalize
Feb 10, 2018 12:29:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-02-10T00:29:21.990Z: (80e2bfb66d68f3e1): Cleaning up.
Feb 10, 2018 12:29:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-02-10T00:29:22.051Z: (80e2bfb66d68ffe7): Stopping worker pool...
Feb 10, 2018 12:31:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-02-10T00:31:35.389Z: (a0b02d63ba7bd7a4): Autoscaling: Resized worker pool from 1 to 0.
Feb 10, 2018 12:31:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-02-10T00:31:35.414Z: (a0b02d63ba7bd162): Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
Feb 10, 2018 12:31:45 AM org.apache.beam.runners.dataflow.DataflowPipelineJob waitUntilFinish
INFO: Job 2018-02-09_16_25_07-14846986513957444259 finished with status DONE.
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 07:53 min
[INFO] Finished at: 2018-02-10T00:31:45Z
[INFO] Final Memory: 50M/1098M
[INFO] ------------------------------------------------------------------------
gsutil cat gs://temp-storage-for-release-validation-tests/quickstart/count* | grep Montague:
Montague: 47
Verified Montague: 47
gsutil rm gs://temp-storage-for-release-validation-tests/quickstart/count*
Removing gs://temp-storage-for-release-validation-tests/quickstart/counts-00000-of-00003...
/ [1 objects]
Removing gs://temp-storage-for-release-validation-tests/quickstart/counts-00001-of-00003...
/ [2 objects]
Removing gs://temp-storage-for-release-validation-tests/quickstart/counts-00002-of-00003...
/ [3 objects]
Operation completed over 3 objects.
[SUCCESS]
FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':runners:spark:runQuickstartJavaSpark'.
> Process 'command '/usr/local/asfpackages/java/jdk1.8.0_152/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
BUILD FAILED in 9m 2s
6 actionable tasks: 6 executed
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Not sending mail to unregistered user [email protected]