Hi all,

I'm trying to run the sample Spark application on v1.2.0 and above, but I've run
into a strange issue, shown below. It only appears in v1.2.0 and later; v1.1.0
and v1.1.1 work fine.

The sample code:

  import org.apache.spark.{SparkConf, SparkContext}

  // conf is a SparkConf built earlier (see below for roughly how it is set up)
  val sc: SparkContext = new SparkContext(conf)

  // Standard Pi estimation: sample random points in the unit square and
  // count how many fall inside the unit circle.
  val NUM_SAMPLES = 10
  val count = sc.parallelize(1 to NUM_SAMPLES).map { i =>
    val x = Math.random()
    val y = Math.random()
    if (x * x + y * y < 1) 1 else 0
  }.reduce(_ + _)
  println("Pi is roughly " + 4.0 * count / NUM_SAMPLES)
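
For completeness, conf is built roughly along these lines; the app name and
master URL below are placeholders rather than my exact values:

  import org.apache.spark.SparkConf

  // Placeholder values: the real app name and master URL differ in my setup.
  val conf = new SparkConf()
    .setAppName("SparkPi")
    .setMaster("spark://<master-host>:7077")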

The exception:
************************************************************
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
15/03/19 01:10:17 INFO CoarseGrainedExecutorBackend: Registered signal handlers for [TERM, HUP, INT]
15/03/19 01:10:17 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
15/03/19 01:10:17 INFO SecurityManager: Changing view acls to: hduser,eason.hu
15/03/19 01:10:17 INFO SecurityManager: Changing modify acls to: hduser,eason.hu
15/03/19 01:10:17 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(hduser, eason.hu); users with modify permissions: Set(hduser, eason.hu)
15/03/19 01:10:18 INFO Slf4jLogger: Slf4jLogger started
15/03/19 01:10:18 INFO Remoting: Starting remoting
15/03/19 01:10:18 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://driverPropsFetcher@hduser-07:59122]
15/03/19 01:10:18 INFO Utils: Successfully started service 'driverPropsFetcher' on port 59122.
15/03/19 01:10:21 WARN ReliableDeliverySupervisor: Association with remote system [akka.tcp://sparkDriver@192.168.1.53:65001] has failed, address is now gated for [5000] ms. Reason is: [Association failed with [akka.tcp://sparkDriver@192.168.1.53:65001]].
15/03/19 01:10:48 ERROR UserGroupInformation: PriviledgedActionException as:eason.hu (auth:SIMPLE) cause:java.util.concurrent.TimeoutException: Futures timed out after [30 seconds]
Exception in thread "main" java.lang.reflect.UndeclaredThrowableException: Unknown exception in doAs
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1421)
        at org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:59)
        at org.apache.spark.executor.CoarseGrainedExecutorBackend$.run(CoarseGrainedExecutorBackend.scala:128)
        at org.apache.spark.executor.CoarseGrainedExecutorBackend$.main(CoarseGrainedExecutorBackend.scala:224)
        at org.apache.spark.executor.CoarseGrainedExecutorBackend.main(CoarseGrainedExecutorBackend.scala)
Caused by: java.security.PrivilegedActionException: java.util.concurrent.TimeoutException: Futures timed out after [30 seconds]
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
        ... 4 more
Caused by: java.util.concurrent.TimeoutException: Futures timed out after [30 seconds]
        at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
        at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
        at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
        at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
        at scala.concurrent.Await$.result(package.scala:107)
        at org.apache.spark.executor.CoarseGrainedExecutorBackend$$anonfun$run$1.apply$mcV$sp(CoarseGrainedExecutorBackend.scala:144)
        at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:60)
        at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:59)
        ... 7 more
************************************************************

Do you have any clues as to why this happens only in v1.2.0 and above? How can I
resolve this issue?
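
From the log it looks like the executor cannot reach the driver at
akka.tcp://sparkDriver@192.168.1.53:65001 and then times out fetching the driver
properties. Would it make sense to pin the driver address and port (and raise the
timeout) along these lines? The values below are just my guesses, not anything I
know to be correct:

  import org.apache.spark.SparkConf

  // Guesses only: pin the driver address/port the executors try to reach,
  // and raise the ask timeout that appears to expire after 30 seconds.
  val conf = new SparkConf()
    .setAppName("SparkPi")
    .set("spark.driver.host", "192.168.1.53")  // driver address seen in the executor log
    .set("spark.driver.port", "65001")         // fixed port instead of a random one
    .set("spark.akka.askTimeout", "120")       // seconds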

Thank you very much,
Eason


