I also tried sc.stop(). Sorry, I did not include that in my question, but I am
still getting the thread exception. I should also mention that I am running
this on a virtual machine.
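For context, here is a minimal sketch of the shutdown pattern I mean: the job
runs inside try/finally so sc.stop() is always called, with an explicit
System.exit(0) afterwards. The class name SimpleApp matches the thread group in
the log below, but the master URL and the job itself are just placeholders, not
my actual code.

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;

    public class SimpleApp {
        public static void main(String[] args) {
            SparkConf conf = new SparkConf()
                    .setAppName("SimpleApp")
                    .setMaster("local[*]");   // placeholder; adjust for the standalone cluster
            JavaSparkContext sc = new JavaSparkContext(conf);
            try {
                // illustrative job only
                long count = sc.textFile("README.md").count();
                System.out.println("Lines: " + count);
            } finally {
                sc.stop();      // shut down the SparkContext and its actor system
            }
            System.exit(0);     // force JVM exit in case non-daemon threads linger
        }
    }

Even with sc.stop() in the finally block, I still see the output below when
running through mvn exec:java.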

15/07/07 06:00:32 ERROR ActorSystemImpl: Uncaught error from thread [sparkDriver-akka.actor.default-dispatcher-5]
java.lang.InterruptedException: Interrupted while processing system messages
    at akka.dispatch.Mailbox.processAllSystemMessages(Mailbox.scala:265)
    at akka.dispatch.Mailbox.run(Mailbox.scala:219)
    at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
    at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
    at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
    at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
    at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
15/07/07 06:00:32 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
15/07/07 06:00:32 INFO RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
15/07/07 06:00:32 INFO RemoteActorRefProvider$RemotingTerminator: Remoting shut down.
[WARNING] thread Thread[ForkJoinPool-3-worker-3,5,SimpleApp] was interrupted but is still alive after waiting at least 12877msecs
[WARNING] thread Thread[ForkJoinPool-3-worker-3,5,SimpleApp] will linger despite being asked to die via interruption
[WARNING] NOTE: 1 thread(s) did not finish despite being asked to via interruption. This is not a problem with exec:java, it is a problem with the running code. Although not serious, it should be remedied.
[INFO] Total time: 29.896s



