Looking into the work folder of the problematic application, it seems the application keeps creating executors over and over. The worker's error log is as follows:
Exception in thread "main" java.lang.reflect.UndeclaredThrowableException: Unknown exception in doAs
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1134)
        at org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:52)
        at org.apache.spark.executor.CoarseGrainedExecutorBackend$.run(CoarseGrainedExecutorBackend.scala:113)
        at org.apache.spark.executor.CoarseGrainedExecutorBackend$.main(CoarseGrainedExecutorBackend.scala:156)
        at org.apache.spark.executor.CoarseGrainedExecutorBackend.main(CoarseGrainedExecutorBackend.scala)
Caused by: java.security.PrivilegedActionException: java.util.concurrent.TimeoutException: Futures timed out after [30 seconds]
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
        ... 4 more
Caused by: java.util.concurrent.TimeoutException: Futures timed out after [30 seconds]
        at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
        at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
        at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
        at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
        at scala.concurrent.Await$.result(package.scala:107)
        at org.apache.spark.executor.CoarseGrainedExecutorBackend$$anonfun$run$1.apply$mcV$sp(CoarseGrainedExecutorBackend.scala:125)
        at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:53)
        at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:52)
        ... 7 more
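
The trace shows the executor dying while blocked in an Await.result inside CoarseGrainedExecutorBackend.run, i.e. while waiting on a reply from the driver during registration. One workaround sketch is to raise that timeout — but note the key names below are an assumption based on Akka-era Spark 1.x (the 30 s in the trace matches the usual spark.akka.askTimeout default); verify them against your version's configuration docs, and keep in mind a longer timeout only helps if the driver is slow to answer, not if it is gone entirely:

```
# spark-defaults.conf fragment -- a hedged sketch, not a confirmed fix.
# spark.akka.askTimeout governs blocking ask operations such as the
# executor's registration with the driver; spark.akka.timeout is the
# general Akka communication timeout. Both in seconds.
spark.akka.askTimeout    120
spark.akka.timeout       120
```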

In the master's log, I also see errors being generated continuously:
...
15/04/19 17:24:20 ERROR EndpointWriter: dropping message [class org.apache.spark.deploy.DeployMessages$ExecutorUpdated] for non-local recipient [Actor[akka.tcp://sparkDriver@spark101:46215/user/$b#-642262056]] arriving at [akka.tcp://sparkDriver@spark101:46215] inbound addresses are [akka.tcp://sparkMaster@spark101:7077]
15/04/19 17:24:20 ERROR EndpointWriter: dropping message [class org.apache.spark.deploy.DeployMessages$ExecutorAdded] for non-local recipient [Actor[akka.tcp://sparkDriver@spark101:46215/user/$b#-642262056]] arriving at [akka.tcp://sparkDriver@spark101:46215] inbound addresses are [akka.tcp://sparkMaster@spark101:7077]
15/04/19 17:24:20 ERROR EndpointWriter: dropping message [class org.apache.spark.deploy.DeployMessages$ExecutorUpdated] for non-local recipient [Actor[akka.tcp://sparkDriver@spark101:46215/user/$b#-642262056]] arriving at [akka.tcp://sparkDriver@spark101:46215] inbound addresses are [akka.tcp://sparkMaster@spark101:7077]
15/04/19 17:24:21 ERROR EndpointWriter: dropping message [class org.apache.spark.deploy.DeployMessages$ExecutorUpdated] for non-local recipient [Actor[akka.tcp://sparkDriver@spark101:46215/user/$b#-642262056]] arriving at [akka.tcp://sparkDriver@spark101:46215] inbound addresses are [akka.tcp://sparkMaster@spark101:7077]
15/04/19 17:24:21 ERROR UserGroupInformation: PriviledgedActionException as:root cause:java.util.concurrent.TimeoutException: Futures timed out after [30 seconds]
15/04/19 17:24:22 ERROR EndpointWriter: dropping message [class org.apache.spark.deploy.DeployMessages$ExecutorUpdated] for non-local recipient [Actor[akka.tcp://sparkDriver@spark101:53140/user/$b#-510580371]] arriving at [akka.tcp://sparkDriver@spark101:53140] inbound addresses are [akka.tcp://sparkMaster@spark101:7077]
15/04/19 17:24:22 ERROR EndpointWriter: dropping message [class org.apache.spark.deploy.DeployMessages$ExecutorAdded] for non-local recipient [Actor[akka.tcp://sparkDriver@spark101:53140/user/$b#-510580371]] arriving at [akka.tcp://sparkDriver@spark101:53140] inbound addresses are [akka.tcp://sparkMaster@spark101:7077]
...
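
These EndpointWriter errors mean the master is receiving replies addressed to a sparkDriver endpoint that is no longer reachable on that port. Notably, two different driver ports (46215 and 53140) show up within seconds, which fits the thread's symptom of the application being submitted twice. A quick way to enumerate the driver addresses involved (the master.log file name here is a placeholder for wherever your master writes its log):

```shell
# List the distinct driver addresses the master is dropping messages
# for. More than one address for a single app run suggests multiple
# submissions / driver restarts.
grep 'dropping message' master.log \
  | grep -oE 'sparkDriver@[^]/]*' \
  | sort -u
```

Against the excerpt above, this prints sparkDriver@spark101:46215 and sparkDriver@spark101:53140 — two distinct driver instances.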



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/spark-application-was-submitted-twice-unexpectedly-tp22551p22560.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
