UserGroupInformation: PriviledgedActionException as

2015-01-15 Thread spraveen
Hi,

When I try to run a program on a remote Spark machine, I get the exception
below:
15/01/16 11:14:39 ERROR UserGroupInformation: PriviledgedActionException
as:user1 (auth:SIMPLE) cause:java.util.concurrent.TimeoutException: Futures
timed out after [30 seconds]
Exception in thread "main" java.lang.reflect.UndeclaredThrowableException: Unknown exception in doAs
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1421)
    at org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:59)
    at org.apache.spark.executor.CoarseGrainedExecutorBackend$.run(CoarseGrainedExecutorBackend.scala:115)
    at org.apache.spark.executor.CoarseGrainedExecutorBackend$.main(CoarseGrainedExecutorBackend.scala:163)
    at org.apache.spark.executor.CoarseGrainedExecutorBackend.main(CoarseGrainedExecutorBackend.scala)
Caused by: java.security.PrivilegedActionException: java.util.concurrent.TimeoutException: Futures timed out after [30 seconds]
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
    ... 4 more
Caused by: java.util.concurrent.TimeoutException: Futures timed out after [30 seconds]
    at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
    at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
    at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
    at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
    at scala.concurrent.Await$.result(package.scala:107)
    at org.apache.spark.executor.CoarseGrainedExecutorBackend$$anonfun$run$1.apply$mcV$sp(CoarseGrainedExecutorBackend.scala:127)
    at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:60)
    at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:59)


I am executing this driver class on one machine (say, abc), but Spark is
installed on another machine (xyz:7077).
When I execute the same driver class pointing to local, it works fine.

import java.io.IOException;
import java.io.Serializable;
import java.util.ArrayList;
import java.util.List;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.Function2;

public class TestSpark implements Serializable {

    public static void main(String[] args) throws IOException {
        TestSpark test = new TestSpark();
        test.testReduce();
    }

    public void testReduce() throws IOException {

        SparkConf conf = new SparkConf().setMaster("spark://xyz:7077").setAppName("Sample App");
        String[] pathToJar = {"/home/user1/Desktop/Jars/TestSpark.jar"};

        //SparkConf conf = new SparkConf().setMaster("spark://abc:7077").setAppName("Sample App").setJars(pathToJar);
        //SparkConf conf = new SparkConf().setMaster("local").setAppName("Sample App");

        JavaSparkContext jsc = new JavaSparkContext(conf);

        List<Integer> data = new ArrayList<Integer>();

        for (int i = 1; i < 500; i++) {
            data.add(i);
        }

        System.out.println("Size : " + data.size());

        JavaRDD<Integer> distData = jsc.parallelize(data);

        Integer total = distData.reduce(new Function2<Integer, Integer, Integer>() {
            @Override
            public Integer call(Integer v1, Integer v2) throws Exception {
                String s1 = "v1 : " + v1 + " v2 : " + v2 + " -- " + this;
                return v1 + v2;
            }
        });
        System.out.println("--" + total);
    }
}

I have also tried setting spark.driver.host to xyz and spark.driver.port to
7077 while creating the Spark context, but it did not help.
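For reference, the two properties mentioned above are set on the SparkConf before the context is created. A minimal sketch (the port 51810 is an illustrative placeholder, not from this thread; note that spark.driver.host should normally name the machine the driver runs on, not the master):

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class DriverNetConf {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
                .setMaster("spark://xyz:7077")
                .setAppName("Sample App")
                // Host of the machine running this driver program, reachable
                // from the cluster -- not the master's host:
                .set("spark.driver.host", "abc")
                // Any port on the driver machine that the cluster can connect to:
                .set("spark.driver.port", "51810");
        JavaSparkContext jsc = new JavaSparkContext(conf);
        // ... job ...
        jsc.stop();
    }
}
```

This assumes the Spark jars are on the classpath; it is a configuration sketch rather than a ready-to-run program.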

Please advise.




--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/UserGroupInformation-PriviledgedActionException-as-tp21182.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: ERROR UserGroupInformation: PriviledgedActionException

2014-11-05 Thread Saiph Kappa
I am running the same version of Spark on the server (master + worker) and
on the client/driver.

For the server I am using the binaries spark-1.1.0-bin-hadoop1.
In the client I am using the same version:

<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.1.0</version>
</dependency>
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-streaming_2.10</artifactId>
  <version>1.1.0</version>
</dependency>
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-streaming-twitter_2.10</artifactId>
  <version>1.1.0</version>
</dependency>
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-examples_2.10</artifactId>
  <version>1.1.0</version>
</dependency>
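Since a client/cluster version mismatch is a common cause of this timeout, one quick sanity check is to print the Spark version the client classpath actually provides and compare it with the version banner on the standalone master's web UI. A sketch (assumes spark-core on the classpath, as in the POM above):

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class VersionCheck {
    public static void main(String[] args) {
        // Use a local master just to read the client-side version; compare
        // it with the version shown on the master's web UI (typically port 8080).
        SparkConf conf = new SparkConf().setMaster("local").setAppName("VersionCheck");
        JavaSparkContext jsc = new JavaSparkContext(conf);
        System.out.println("Client Spark version: " + jsc.version());
        jsc.stop();
    }
}
```

Like the dependency list itself, this is a configuration-level check, not part of the job.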




On Wed, Nov 5, 2014 at 6:32 AM, Akhil Das ak...@sigmoidanalytics.com
wrote:

 It's more likely that you are running different versions of Spark.

 Thanks
 Best Regards

 On Wed, Nov 5, 2014 at 3:05 AM, Saiph Kappa saiph.ka...@gmail.com wrote:

 I set the host and port of the driver and now the error slightly changed

 Using Spark's default log4j profile:
 org/apache/spark/log4j-defaults.properties
 14/11/04 21:13:48 INFO CoarseGrainedExecutorBackend: Registered signal
 handlers for [TERM, HUP, INT]
 14/11/04 21:13:48 INFO SecurityManager: Changing view acls to:
 myuser,Myuser
 14/11/04 21:13:48 INFO SecurityManager: Changing modify acls to:
 myuser,Myuser
 14/11/04 21:13:48 INFO SecurityManager: SecurityManager: authentication
 disabled; ui acls disabled; users with view permissions: Set(myuser,
 Myuser); users with modify permissions: Set(myuser, Myuser)
 14/11/04 21:13:48 INFO Slf4jLogger: Slf4jLogger started
 14/11/04 21:13:48 INFO Remoting: Starting remoting
 14/11/04 21:13:49 INFO Remoting: Remoting started; listening on
 addresses :[akka.tcp://driverPropsFetcher@myserver:37456]
 14/11/04 21:13:49 INFO Remoting: Remoting now listens on addresses:
 [akka.tcp://driverPropsFetcher@myserver:37456]
 14/11/04 21:13:49 INFO Utils: Successfully started service
 'driverPropsFetcher' on port 37456.
 14/11/04 21:14:19 ERROR UserGroupInformation: PriviledgedActionException
 as:Myuser cause:java.util.concurrent.TimeoutException: Futures timed out
 after [30 seconds]
 Exception in thread "main" java.lang.reflect.UndeclaredThrowableException: Unknown exception in doAs
     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1134)
     at org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:52)
     at org.apache.spark.executor.CoarseGrainedExecutorBackend$.run(CoarseGrainedExecutorBackend.scala:113)
     at org.apache.spark.executor.CoarseGrainedExecutorBackend$.main(CoarseGrainedExecutorBackend.scala:156)
     at org.apache.spark.executor.CoarseGrainedExecutorBackend.main(CoarseGrainedExecutorBackend.scala)
 Caused by: java.security.PrivilegedActionException: java.util.concurrent.TimeoutException: Futures timed out after [30 seconds]
     at java.security.AccessController.doPrivileged(Native Method)
     at javax.security.auth.Subject.doAs(Subject.java:415)
     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
     ... 4 more
 Caused by: java.util.concurrent.TimeoutException: Futures timed out after [30 seconds]
     at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
     at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
     at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
     at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
     at scala.concurrent.Await$.result(package.scala:107)
     at org.apache.spark.executor.CoarseGrainedExecutorBackend$$anonfun$run$1.apply$mcV$sp(CoarseGrainedExecutorBackend.scala:125)
     at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:53)
     at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:52)
     ... 7 more


 Any ideas?

 Thanks.

 On Tue, Nov 4, 2014 at 11:29 AM, Akhil Das ak...@sigmoidanalytics.com
 wrote:

 If you want to run the Spark application from a remote machine, you have
 to set at least the following configurations properly.

 *spark.driver.host* - the IP/host from which you are submitting the job
 (make sure this address is reachable from the cluster).

 *spark.driver.port* - set it to a port number that is accessible from
 the Spark cluster.

 You can find more networking configuration options here:
 http://spark.apache.org/docs/latest/configuration.html#networking
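The reachability check described above amounts to a plain TCP connect from a cluster node back to the driver. A minimal JDK-only sketch (the host "abc" and port 51810 are illustrative placeholders for the driver host and spark.driver.port):

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class ReachabilityCheck {
    // Returns true if host:port accepts a TCP connection within timeoutMs.
    static boolean canConnect(String host, int port, int timeoutMs) {
        try (Socket s = new Socket()) {
            s.connect(new InetSocketAddress(host, port), timeoutMs);
            return true;
        } catch (IOException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // Run this on a worker/master node while the driver is up.
        System.out.println(canConnect("abc", 51810, 3000)
                ? "driver port reachable" : "driver port NOT reachable");
    }
}
```

If the connect fails, a firewall or NAT between the cluster and the driver machine is a likely cause of the 30-second future timeout in the logs.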

 Thanks
 Best Regards

 On Tue, Nov 4, 2014 at 6:07 AM, Saiph Kappa saiph.ka...@gmail.com
 wrote:

 Hi,

 I am trying to submit a job to a spark cluster running on a single
 machine (1 master + 1 worker) with hadoop 1.0.4. I submit

Re: ERROR UserGroupInformation: PriviledgedActionException

2014-11-04 Thread Akhil Das
It's more likely that you are running different versions of Spark.

Thanks
Best Regards

 On Tue, Nov 4, 2014 at 6:07 AM, Saiph Kappa saiph.ka...@gmail.com
 wrote:

 Hi,

 I am trying to submit a job to a spark cluster running on a single
 machine (1 master + 1 worker) with hadoop 1.0.4. I submit it in the code:
 «val sparkConf = new
 SparkConf().setMaster("spark://myserver:7077").setAppName("MyApp").setJars(Array("target/my-app-1.0-SNAPSHOT.jar"))».

 When I run this application on the same machine as the cluster
 everything works fine.

 But when I run it from a remote machine I get the following error:

 Using Spark's default log4j profile:
 org/apache/spark/log4j-defaults.properties
 14/11/04 00:15:38 INFO CoarseGrainedExecutorBackend: Registered signal
 handlers for [TERM, HUP, INT]
 14/11/04 00:15:38 INFO SecurityManager: Changing view acls to:
 myuser,Myuser
 14/11/04 00:15:38 INFO SecurityManager: Changing modify acls to:
 myuser,Myuser
 14/11/04 00:15:38 INFO SecurityManager: SecurityManager: authentication
 disabled; ui acls disabled; users with view permissions: Set(myuser,
 Myuser); users with modify permissions: Set(myuser, Myuser)
 14/11/04 00:15:38 INFO Slf4jLogger: Slf4jLogger started
 14/11/04 00:15:38 INFO Remoting: Starting remoting
 14/11/04 00:15:38 INFO Remoting