Re: JAVA_HOME problem

2015-04-28 Thread Marcelo Vanzin
Are you using a Spark build that matches your YARN cluster version? That could happen if you're using Spark built against a newer version of YARN than the one you're running. On Thu, Apr 2, 2015 at 12:53 AM, 董帅阳 917361...@qq.com wrote: spark 1.3.0 spark@pc-zjqdyyn1:~ tail
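Marcelo's version-mismatch suggestion can be checked from the submitting host; a minimal sketch, assuming both `spark-submit` and the Hadoop `yarn` CLI are on the PATH:

```shell
# Print the Spark version the client was built with (spark-submit
# historically writes its banner to stderr, hence the redirect).
spark-submit --version 2>&1

# Print the Hadoop/YARN version the cluster is actually running.
yarn version
```

If the Spark build targets a newer Hadoop/YARN profile than the cluster runs, the generated container launch script can contain environment expansions (such as the cross-platform `{{JAVA_HOME}}` form) that an older NodeManager leaves unexpanded, which matches the JAVA_HOME symptom tracked in SPARK-6681.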

Re: JAVA_HOME problem

2015-04-28 Thread sourabh chaki
I was able to solve this problem by hard-coding JAVA_HOME inside the org.apache.spark.deploy.yarn.Client.scala class:

- val commands = prefixEnv ++ Seq(YarnSparkHadoopUtil.expandEnvironment(Environment.JAVA_HOME) + "/bin/java", "-server")
+ val commands = prefixEnv ++ Seq("/usr/java/jdk1.7.0_51/bin/java", "-server")

Somehow
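Rather than patching Client.scala, Spark's documented per-application environment settings can usually achieve the same effect without rebuilding Spark. A hedged sketch (the JDK path mirrors the one in the thread; the class name and jar are placeholders):

```shell
# Point both the YARN ApplicationMaster and the executors at an explicit JDK,
# instead of relying on JAVA_HOME expansion inside the container launch script.
spark-submit \
  --master yarn-cluster \
  --conf spark.yarn.appMasterEnv.JAVA_HOME=/usr/java/jdk1.7.0_51 \
  --conf spark.executorEnv.JAVA_HOME=/usr/java/jdk1.7.0_51 \
  --class com.example.Main app.jar
```

`spark.yarn.appMasterEnv.[Name]` and `spark.executorEnv.[Name]` are the documented Spark mechanisms for injecting environment variables into YARN containers, so this survives Spark upgrades where a patched Client.scala would not.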

Re: JAVA_HOME problem

2015-04-24 Thread sourabh chaki
Yes Akhil. This is the same issue. I have updated my comment in that ticket. Thanks Sourabh On Fri, Apr 24, 2015 at 12:02 PM, Akhil Das ak...@sigmoidanalytics.com wrote: Isn't this related to this https://issues.apache.org/jira/browse/SPARK-6681 Thanks Best Regards On Fri, Apr 24, 2015

Re: JAVA_HOME problem

2015-04-24 Thread sourabh chaki
I am also facing the same problem with spark 1.3.0 and yarn-client and yarn-cluster mode. Launching yarn container failed and this is the error in stderr: Container: container_1429709079342_65869_01_01

JAVA_HOME problem

2015-04-02 Thread 董帅阳
spark 1.3.0

spark@pc-zjqdyyn1:~ tail /etc/profile
export JAVA_HOME=/usr/jdk64/jdk1.7.0_45
export PATH=$PATH:$JAVA_HOME/bin
#
# End of /etc/profile
#

But ERROR LOG Container: container_1427449644855_0092_02_01 on pc-zjqdyy04_45454
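A YARN NodeManager launches containers without sourcing the interactive /etc/profile, so an export there is invisible to the container. A sketch of setting JAVA_HOME where Hadoop's own scripts do read it, assuming a typical Hadoop 2.x layout (the conf path and JDK path are examples):

```shell
# On each NodeManager host, set JAVA_HOME in hadoop-env.sh, which the
# Hadoop daemons source; /etc/profile is only read by interactive login shells.
echo 'export JAVA_HOME=/usr/jdk64/jdk1.7.0_45' >> /etc/hadoop/conf/hadoop-env.sh

# Restart the NodeManager so the new environment takes effect (Hadoop 2.x syntax).
yarn-daemon.sh stop nodemanager
yarn-daemon.sh start nodemanager
```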

Re: JAVA_HOME problem with upgrade to 1.3.0

2015-03-23 Thread Williams, Ken
From: Williams, Ken ken.willi...@windlogics.com Date: Thursday, March 19, 2015 at 10:59 AM To: Spark list user@spark.apache.org Subject: JAVA_HOME problem with upgrade to 1.3.0 […] Finally, I go and check the YARN

JAVA_HOME problem with upgrade to 1.3.0

2015-03-19 Thread Williams, Ken
I’m trying to upgrade a Spark project, written in Scala, from Spark 1.2.1 to 1.3.0, so I changed my `build.sbt` like so:

-libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.1" % "provided"
+libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.0" % "provided"

then make an

Re: JAVA_HOME problem with upgrade to 1.3.0

2015-03-19 Thread Ted Yu
JAVA_HOME, an environment variable, should be defined on the node where appattempt_1420225286501_4699_02 ran. Cheers On Thu, Mar 19, 2015 at 8:59 AM, Williams, Ken ken.willi...@windlogics.com wrote: I’m trying to upgrade a Spark project, written in Scala, from Spark 1.2.1 to 1.3.0, so I

Re: JAVA_HOME problem with upgrade to 1.3.0

2015-03-19 Thread Williams, Ken
From: Ted Yu yuzhih...@gmail.com Date: Thursday, March 19, 2015 at 11:05 AM JAVA_HOME, an environment variable, should be defined on the node where appattempt_1420225286501_4699_02 ran. Has this behavior changed in 1.3.0 since 1.2.1 though? Using 1.2.1 and