Re: library dependencies to run spark local mode

2016-02-04 Thread Ted Yu
Which Spark release are you using?

Are there other clues in the logs? If so, please pastebin them.

Cheers

On Thu, Feb 4, 2016 at 2:49 AM, Valentin Popov wrote:

> Hi all,
>
> I’m trying to run Spark in local mode, using this code:
>
> SparkConf conf = new SparkConf().setAppName("JavaWord2VecExample"
> ).setMaster("local[*]");
> JavaSparkContext sc = new JavaSparkContext(conf);
>
> but after a while (10 sec) I get an exception; here is the stack trace:
> java.util.concurrent.TimeoutException: Futures timed out after [1
> milliseconds]
> at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
> at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
> at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
> at
> scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
> at scala.concurrent.Await$.result(package.scala:107)
> at akka.remote.Remoting.start(Remoting.scala:179)
> at
> akka.remote.RemoteActorRefProvider.init(RemoteActorRefProvider.scala:184)
> at akka.actor.ActorSystemImpl.liftedTree2$1(ActorSystem.scala:620)
> at akka.actor.ActorSystemImpl._start$lzycompute(ActorSystem.scala:617)
> at akka.actor.ActorSystemImpl._start(ActorSystem.scala:617)
> at akka.actor.ActorSystemImpl.start(ActorSystem.scala:634)
> at akka.actor.ActorSystem$.apply(ActorSystem.scala:142)
> at akka.actor.ActorSystem$.apply(ActorSystem.scala:119)
> at
> org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121)
> at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
> at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:52)
> at
> org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1964)
> at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
> at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1955)
> at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:55)
> at org.apache.spark.SparkEnv$.create(SparkEnv.scala:266)
> at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:193)
> at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:288)
> at org.apache.spark.SparkContext.<init>(SparkContext.scala:457)
> at
> org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:59)
> at
> com.stimulus.archiva.datamining.ml.Word2VecTest.word2vec(Word2VecTest.java:23)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:497)
> at
> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
> at
> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> at
> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
> at
> org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
> at
> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
> at
> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
> at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
> at
> org.eclipse.jdt.internal.junit4.runner.JUnit4TestReference.run(JUnit4TestReference.java:86)
> at
> org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> at
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:459)
> at
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:675)
> at
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:382)
> at
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:192)
>
>
>
> Does anyone know which library dependencies can cause such an error?
>
> Regards,
> Valentin
>
>
>
>
>


Re: library dependencies to run spark local mode

2016-02-04 Thread Valentin Popov
It is 1.6.0, built from source.

I’m trying this in my Eclipse project and want to use Spark in it, so I put
the libraries below on the classpath and get no ClassNotFoundException:

akka-actor_2.10-2.3.11.jar
akka-remote_2.10-2.3.11.jar
akka-slf4j_2.10-2.3.11.jar
config-1.2.1.jar
hadoop-auth-2.7.1.jar
hadoop-common-2.7.1.jar
hadoop-mapreduce-client-common-2.7.1.jar
hadoop-mapreduce-client-core-2.7.1.jar
hadoop-mapreduce-client-jobclient-2.7.1.jar
hadoop-mapreduce-client-shuffle-2.7.1.jar
hadoop-yarn-api-2.7.1.jar
hadoop-yarn-common-2.7.1.jar
hamcrest-core-1.3.jar
netty-all-4.0.29.Final.jar
scala-library-2.10.6.jar
spark-core_2.10-1.6.0.jar
spark-mllib_2.10-1.6.0.jar
spark-network-common_2.10-1.6.0.jar
spark-network-shuffle_2.10-1.6.0.jar
spark-sql_2.10-1.6.0.jar
spark-streaming_2.10-1.6.0.jar


> On 4 Feb 2016, at 13:51, Ted Yu wrote:
> 
> Which Spark release are you using?
> 
> Are there other clues in the logs? If so, please pastebin them.
> 
> Cheers


Re: library dependencies to run spark local mode

2016-02-04 Thread Valentin Popov
I think this is the answer…


HADOOP_HOME or hadoop.home.dir are not set.

Sorry
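For anyone hitting the same thing: the missing setting can also be supplied from code, as long as it happens before any Hadoop class loads (i.e. before the SparkContext constructor runs). A minimal sketch — the install path is an assumption, and on Windows the directory must also contain bin\winutils.exe:

```java
public class HadoopHomeFix {
    public static void main(String[] args) {
        // Must run before org.apache.hadoop.util.Shell is first loaded,
        // otherwise Shell.checkHadoopHome() logs the IOException below.
        System.setProperty("hadoop.home.dir", "/opt/hadoop-2.7.1");

        // With the property in place, create the context as before:
        // SparkConf conf = new SparkConf()
        //         .setAppName("JavaWord2VecExample").setMaster("local[*]");
        // JavaSparkContext sc = new JavaSparkContext(conf);

        System.out.println(System.getProperty("hadoop.home.dir"));
        // prints /opt/hadoop-2.7.1
    }
}
```

Setting the HADOOP_HOME environment variable on the Eclipse run configuration achieves the same effect without touching code.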


2016-02-04 14:10:08 o.a.h.u.Shell [DEBUG] Failed to detect a valid hadoop home 
directory
java.io.IOException: HADOOP_HOME or hadoop.home.dir are not set.
at org.apache.hadoop.util.Shell.checkHadoopHome(Shell.java:303) 
[hadoop-common-2.7.1.jar:na]
at org.apache.hadoop.util.Shell.<clinit>(Shell.java:328) 
[hadoop-common-2.7.1.jar:na]
at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:80) 
[hadoop-common-2.7.1.jar:na]
at 
org.apache.hadoop.security.SecurityUtil.getAuthenticationMethod(SecurityUtil.java:610)
 [hadoop-common-2.7.1.jar:na]
at 
org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:272)
 [hadoop-common-2.7.1.jar:na]
at 
org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:260)
 [hadoop-common-2.7.1.jar:na]
at 
org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:790)
 [hadoop-common-2.7.1.jar:na]
at 
org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:760)
 [hadoop-common-2.7.1.jar:na]
at 
org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:633)
 [hadoop-common-2.7.1.jar:na]
at 
org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2136)
 [spark-core_2.10-1.6.0.jar:1.6.0]
at 
org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2136)
 [spark-core_2.10-1.6.0.jar:1.6.0]
at scala.Option.getOrElse(Option.scala:120) 
[scala-library-2.10.6.jar:na]
at org.apache.spark.util.Utils$.getCurrentUserName(Utils.scala:2136) 
[spark-core_2.10-1.6.0.jar:1.6.0]
at org.apache.spark.SparkContext.<init>(SparkContext.scala:322) 
[spark-core_2.10-1.6.0.jar:1.6.0]
at 
org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:59) 
[spark-core_2.10-1.6.0.jar:1.6.0]
at 
com.stimulus.archiva.datamining.ml.Word2VecTest.word2vec(Word2VecTest.java:23) 
[classes/:na]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
~[na:1.8.0_71]
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
~[na:1.8.0_71]
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 ~[na:1.8.0_71]
at java.lang.reflect.Method.invoke(Method.java:497) ~[na:1.8.0_71]
at 
org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
 [junit-4.12.jar:4.12]
at 
org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
 [junit-4.12.jar:4.12]
at 
org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
 [junit-4.12.jar:4.12]
at 
org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
 [junit-4.12.jar:4.12]
at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325) 
[junit-4.12.jar:4.12]
at 
org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
 [junit-4.12.jar:4.12]
at 
org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
 [junit-4.12.jar:4.12]
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290) 
[junit-4.12.jar:4.12]
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71) 
[junit-4.12.jar:4.12]
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288) 
[junit-4.12.jar:4.12]
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58) 
[junit-4.12.jar:4.12]
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268) 
[junit-4.12.jar:4.12]
at org.junit.runners.ParentRunner.run(ParentRunner.java:363) 
[junit-4.12.jar:4.12]
at 
org.eclipse.jdt.internal.junit4.runner.JUnit4TestReference.run(JUnit4TestReference.java:86)
 [.cp/:na]
at 
org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38) 
[.cp/:na]
at 
org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:459)
 [.cp/:na]
at 
org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:675)
 [.cp/:na]
at 
org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:382)
 [.cp/:na]
at 
org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:192)
 [.cp/:na]
2016-02-04 14:10:08 o.a.h.u.Shell [DEBUG] setsid is not available on this 
machine. So not using it.
2016-02-04 14:10:08 o.a.h.u.Shell [DEBUG] setsid exited with exit code 0
2016-02-04 14:10:08 o.a.h.s.a.u.KerberosName [DEBUG] Kerberos krb5 
configuration not found, setting default realm to empty
2016-02-04 14:10:08 o.a.h.s.Groups [DEBUG]  Creating new Groups object
2016-02-04 14:10:08 o.a.h.u.NativeCodeLoader [DEBUG] Trying to load the