Hi there, I'm a newbie to Spark, just starting my learning journey with Spark 2.0.1, and I've hit an issue that may be totally simple. When I try to run the SparkPi example in Scala with the following command, an exception is thrown. Is this expected behavior, or is something wrong with my command?
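(For context: my understanding of the standalone-mode docs is that the master listens on two submission ports, 7077 for the regular RPC endpoint that client mode uses and 6066 for the REST gateway used by cluster-mode submissions, so I wonder whether the client-mode command should instead look like the sketch below.)

# bin/spark-submit --master spark://spark-49:7077 --deploy-mode client \
    --class org.apache.spark.examples.SparkPi \
    examples/jars/spark-examples_2.11-2.0.1.jar 10000

Anyway, here is the exact command I ran and the full output: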
# bin/spark-submit --master spark://spark-49:6066 --deploy-mode client --class org.apache.spark.examples.SparkPi examples/jars/spark-examples_2.11-2.0.1.jar 10000
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
16/10/14 01:44:59 INFO SparkContext: Running Spark version 2.0.1
16/10/14 01:45:00 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/10/14 01:45:00 INFO SecurityManager: Changing view acls to: root
16/10/14 01:45:00 INFO SecurityManager: Changing modify acls to: root
16/10/14 01:45:00 INFO SecurityManager: Changing view acls groups to:
16/10/14 01:45:00 INFO SecurityManager: Changing modify acls groups to:
16/10/14 01:45:00 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); groups with view permissions: Set(); users with modify permissions: Set(root); groups with modify permissions: Set()
16/10/14 01:45:00 INFO Utils: Successfully started service 'sparkDriver' on port 42464.
16/10/14 01:45:00 INFO SparkEnv: Registering MapOutputTracker
16/10/14 01:45:00 INFO SparkEnv: Registering BlockManagerMaster
16/10/14 01:45:00 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-f8c0a89d-cbf6-4aec-9054-da6e35999be5
16/10/14 01:45:00 INFO MemoryStore: MemoryStore started with capacity 434.4 MB
16/10/14 01:45:00 INFO SparkEnv: Registering OutputCommitCoordinator
16/10/14 01:45:00 INFO Utils: Successfully started service 'SparkUI' on port 4040.
16/10/14 01:45:00 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://172.29.10.49:4040
16/10/14 01:45:00 INFO SparkContext: Added JAR file:/opt/spark-2.0.1-bin-hadoop2.7/examples/jars/spark-examples_2.11-2.0.1.jar at spark://172.29.10.49:42464/jars/spark-examples_2.11-2.0.1.jar with timestamp 1476423900827
16/10/14 01:45:00 INFO StandaloneAppClient$ClientEndpoint: Connecting to master spark://spark-49:6066...
16/10/14 01:45:00 INFO TransportClientFactory: Successfully created connection to spark-49/172.29.10.49:6066 after 21 ms (0 ms spent in bootstraps)
16/10/14 01:45:01 ERROR TransportResponseHandler: Still have 1 requests outstanding when connection from spark-49/172.29.10.49:6066 is closed
16/10/14 01:45:01 WARN StandaloneAppClient$ClientEndpoint: Failed to connect to master spark-49:6066
org.apache.spark.SparkException: Exception thrown in awaitResult
    at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:77)
    at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:75)
    at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:36)
    at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:59)
    at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:59)
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:167)
    at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:83)
    at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:88)
    at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:96)
    at org.apache.spark.deploy.client.StandaloneAppClient$ClientEndpoint$$anonfun$tryRegisterAllMasters$1$$anon$1.run(StandaloneAppClient.scala:106)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:482)
    at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:345)
    at java.util.concurrent.FutureTask.run(FutureTask.java:177)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1156)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:626)
    at java.lang.Thread.run(Thread.java:780)
Caused by: java.io.IOException: Connection from spark-49/172.29.10.49:6066 closed
    at org.apache.spark.network.client.TransportResponseHandler.channelInactive(TransportResponseHandler.java:128)
    at org.apache.spark.network.server.TransportChannelHandler.channelInactive(TransportChannelHandler.java:109)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:208)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:194)
    at io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(ChannelInboundHandlerAdapter.java:75)
    at io.netty.handler.timeout.IdleStateHandler.channelInactive(IdleStateHandler.java:257)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:208)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:194)
    at io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(ChannelInboundHandlerAdapter.java:75)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:208)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:194)
    at io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(ChannelInboundHandlerAdapter.java:75)
    at org.apache.spark.network.util.TransportFrameDecoder.channelInactive(TransportFrameDecoder.java:182)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:208)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:194)
    at io.netty.channel.DefaultChannelPipeline.fireChannelInactive(DefaultChannelPipeline.java:828)
    at io.netty.channel.AbstractChannel$AbstractUnsafe$7.run(AbstractChannel.java:621)
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
    ... 1 more

The client retries twice more, at 01:45:20 and 01:45:40 (each time logging "TransportClientFactory: Found inactive connection to spark-49/172.29.10.49:6066, creating a new one."), and fails with exactly the same exception and stack trace both times. Then it gives up:

16/10/14 01:46:00 ERROR StandaloneSchedulerBackend: Application has been killed. Reason: All masters are unresponsive! Giving up.
16/10/14 01:46:00 WARN StandaloneSchedulerBackend: Application ID is not initialized yet.
16/10/14 01:46:00 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 42284.
16/10/14 01:46:00 INFO NettyBlockTransferService: Server created on 172.29.10.49:42284
16/10/14 01:46:00 INFO SparkUI: Stopped Spark web UI at http://172.29.10.49:4040
16/10/14 01:46:00 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 172.29.10.49, 42284)
16/10/14 01:46:00 INFO BlockManagerMasterEndpoint: Registering block manager 172.29.10.49:42284 with 434.4 MB RAM, BlockManagerId(driver, 172.29.10.49, 42284)
16/10/14 01:46:00 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 172.29.10.49, 42284)
16/10/14 01:46:00 INFO StandaloneSchedulerBackend: Shutting down all executors
16/10/14 01:46:00 INFO CoarseGrainedSchedulerBackend$DriverEndpoint: Asking each executor to shut down
16/10/14 01:46:00 WARN StandaloneAppClient$ClientEndpoint: Drop UnregisterApplication(null) because has not yet connected to master
16/10/14 01:46:00 ERROR MapOutputTrackerMaster: Error communicating with MapOutputTracker
java.lang.InterruptedException
    at java.util.concurrent.locks.AbstractQueuedSynchronizer.tryAcquireSharedNanos(AbstractQueuedSynchronizer.java:1336)
    at scala.concurrent.impl.Promise$DefaultPromise.tryAwait(Promise.scala:208)
    at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:218)
    at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
    at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:190)
    at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
    at scala.concurrent.Await$.result(package.scala:190)
    at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:81)
    at org.apache.spark.rpc.RpcEndpointRef.askWithRetry(RpcEndpointRef.scala:102)
    at org.apache.spark.rpc.RpcEndpointRef.askWithRetry(RpcEndpointRef.scala:78)
    at org.apache.spark.MapOutputTracker.askTracker(MapOutputTracker.scala:100)
    at org.apache.spark.MapOutputTracker.sendTracker(MapOutputTracker.scala:110)
    at org.apache.spark.MapOutputTrackerMaster.stop(MapOutputTracker.scala:580)
    at org.apache.spark.SparkEnv.stop(SparkEnv.scala:84)
    at org.apache.spark.SparkContext$$anonfun$stop$11.apply$mcV$sp(SparkContext.scala:1814)
    at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1287)
    at org.apache.spark.SparkContext.stop(SparkContext.scala:1813)
    at org.apache.spark.scheduler.cluster.StandaloneSchedulerBackend.dead(StandaloneSchedulerBackend.scala:142)
    at org.apache.spark.deploy.client.StandaloneAppClient$ClientEndpoint.markDead(StandaloneAppClient.scala:254)
    at org.apache.spark.deploy.client.StandaloneAppClient$ClientEndpoint$$anon$2.run(StandaloneAppClient.scala:131)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:482)
    at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:345)
    at java.util.concurrent.FutureTask.run(FutureTask.java:177)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:189)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:303)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1156)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:626)
    at java.lang.Thread.run(Thread.java:780)
16/10/14 01:46:00 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
16/10/14 01:46:00 ERROR Utils: Uncaught exception in thread appclient-registration-retry-thread
org.apache.spark.SparkException: Error communicating with MapOutputTracker
    at org.apache.spark.MapOutputTracker.askTracker(MapOutputTracker.scala:104)
    at org.apache.spark.MapOutputTracker.sendTracker(MapOutputTracker.scala:110)
    at org.apache.spark.MapOutputTrackerMaster.stop(MapOutputTracker.scala:580)
    at org.apache.spark.SparkEnv.stop(SparkEnv.scala:84)
    at org.apache.spark.SparkContext$$anonfun$stop$11.apply$mcV$sp(SparkContext.scala:1814)
    at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1287)
    at org.apache.spark.SparkContext.stop(SparkContext.scala:1813)
    at org.apache.spark.scheduler.cluster.StandaloneSchedulerBackend.dead(StandaloneSchedulerBackend.scala:142)
    at org.apache.spark.deploy.client.StandaloneAppClient$ClientEndpoint.markDead(StandaloneAppClient.scala:254)
    at org.apache.spark.deploy.client.StandaloneAppClient$ClientEndpoint$$anon$2.run(StandaloneAppClient.scala:131)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:482)
    at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:345)
    at java.util.concurrent.FutureTask.run(FutureTask.java:177)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:189)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:303)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1156)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:626)
    at java.lang.Thread.run(Thread.java:780)
Caused by: java.lang.InterruptedException
    at java.util.concurrent.locks.AbstractQueuedSynchronizer.tryAcquireSharedNanos(AbstractQueuedSynchronizer.java:1336)
    at scala.concurrent.impl.Promise$DefaultPromise.tryAwait(Promise.scala:208)
    at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:218)
    at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
    at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:190)
    at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
    at scala.concurrent.Await$.result(package.scala:190)
    at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:81)
    at org.apache.spark.rpc.RpcEndpointRef.askWithRetry(RpcEndpointRef.scala:102)
    at org.apache.spark.rpc.RpcEndpointRef.askWithRetry(RpcEndpointRef.scala:78)
    at org.apache.spark.MapOutputTracker.askTracker(MapOutputTracker.scala:100)
    ... 17 more
16/10/14 01:46:01 ERROR SparkContext: Error initializing SparkContext.
java.lang.NullPointerException
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:546)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2275)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$8.apply(SparkSession.scala:831)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$8.apply(SparkSession.scala:823)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:823)
    at org.apache.spark.examples.SparkPi$.main(SparkPi.scala:31)
    at org.apache.spark.examples.SparkPi.main(SparkPi.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:88)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:55)
    at java.lang.reflect.Method.invoke(Method.java:613)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:736)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
16/10/14 01:46:01 INFO SparkContext: SparkContext already stopped.
16/10/14 01:46:01 INFO SparkContext: Successfully stopped SparkContext

The same NullPointerException is then rethrown in the main thread ("Exception in thread "main" java.lang.NullPointerException") with an identical stack trace, and the job shuts down:

16/10/14 01:46:01 INFO DiskBlockManager: Shutdown hook called
16/10/14 01:46:01 INFO ShutdownHookManager: Shutdown hook called
16/10/14 01:46:01 INFO ShutdownHookManager: Deleting directory /tmp/spark-699eaa28-4c25-4361-9775-b019f8fee3a5
16/10/14 01:46:01 INFO ShutdownHookManager: Deleting directory /tmp/spark-699eaa28-4c25-4361-9775-b019f8fee3a5/userFiles-fae1952c-c39e-438c-951b-1f61b846a382

In the corresponding master log, these WARN messages appear at the same timestamps as the failed connection attempts:
16/10/14 01:45:01 WARN HttpParser: Illegal character 0x0 in state=START for buffer HeapByteBuffer@5450f8da[p=1,l=1287,c=16384,r=1286]={\x00<<<\x00\x00\x00\x00\x00\x05\x07\x03r\xB9k\xDd \x04\xE7\x96\x00...t\x00\x0c172.29.10.49>>>\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00...\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00}
16/10/14 01:45:01 WARN HttpParser: badMessage: 400 Illegal character 0x0 for HttpChannelOverHttp@f9dd38cb{r=0,c=false,a=IDLE,uri=}
16/10/14 01:45:20 WARN HttpParser: Illegal character 0x0 in state=START for buffer HeapByteBuffer@5450f8da[p=1,l=1287,c=16384,r=1286]={\x00<<<\x00\x00\x00\x00\x00\x05\x07\x03f\x02Q\xBf\xA4\x81=\x08\x00...t\x00\x0c172.29.10.49>>>\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00...\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00}
16/10/14 01:45:20 WARN HttpParser: badMessage: 400 Illegal character 0x0 for HttpChannelOverHttp@a54eedf2{r=0,c=false,a=IDLE,uri=}
16/10/14 01:45:40 WARN HttpParser: Illegal character 0x0 in state=START for buffer HeapByteBuffer@5450f8da[p=1,l=1287,c=16384,r=1286]={\x00<<<\x00\x00\x00\x00\x00\x05\x07\x03L32/W\xD4\x04}\x00...t\x00\x0c172.29.10.49>>>\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00...\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00}
16/10/14 01:45:40 WARN HttpParser: badMessage: 400 Illegal character 0x0 for HttpChannelOverHttp@45ac289{r=0,c=false,a=IDLE,uri=}
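In case it helps with the diagnosis: those 0x0 characters look like binary (non-HTTP) traffic arriving at an HTTP listener on the master, which would match the driver speaking its RPC protocol to port 6066. To double-check which ports my master actually serves, I believe its startup log can be grepped along these lines (the log path assumes the default tarball layout under /opt, so it may need adjusting):

# grep -E "started service|REST server|MasterWebUI" \
    /opt/spark-2.0.1-bin-hadoop2.7/logs/spark-*-org.apache.spark.deploy.master.Master-*.out

On a default standalone master my understanding is this should show the RPC port (7077 by default), the REST submission port (6066), and the web UI port (8080).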