Are you using a Spark build that matches your YARN cluster version?
That could happen if you're using a Spark build compiled against
a newer version of YARN than the one you're running.
On Thu, Apr 2, 2015 at 12:53 AM, 董帅阳 <917361...@qq.com> wrote:
> spark 1.3.0
>
>
> spark@pc-zjqdyyn1:~> tail /etc/
I was able to solve this problem by hard-coding JAVA_HOME inside the
org.apache.spark.deploy.yarn.Client.scala class:

-val commands = prefixEnv ++ Seq(
-  YarnSparkHadoopUtil.expandEnvironment(Environment.JAVA_HOME) + "/bin/java", "-server")
+val commands = prefixEnv ++ Seq(
+  "/usr/java/jdk1.7.0_51/bin/java", "-server")

Somehow {{JAVA_HOME}} was not getting expanded in the container launch command.
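For reference, patching Client.scala shouldn't normally be necessary: Spark on YARN can pass an explicit JAVA_HOME to the application master and executors through configuration. A sketch, assuming the JDK lives under /usr/jdk64/jdk1.7.0_45 (adjust the path to your cluster):

```
# spark-defaults.conf — point the YARN application master and the
# executors at an explicit JDK (the path here is an example)
spark.yarn.appMasterEnv.JAVA_HOME   /usr/jdk64/jdk1.7.0_45
spark.executorEnv.JAVA_HOME         /usr/jdk64/jdk1.7.0_45
```

This avoids rebuilding Spark and keeps the JDK choice per-deployment rather than baked into the jar.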
Yes Akhil. This is the same issue. I have updated my comment in that ticket.
Thanks
Sourabh
On Fri, Apr 24, 2015 at 12:02 PM, Akhil Das
wrote:
> Isn't this related to this
> https://issues.apache.org/jira/browse/SPARK-6681
>
> Thanks
> Best Regards
>
> On Fri, Apr 24, 2015 at 11:40 AM, sourabh
Isn't this related to this https://issues.apache.org/jira/browse/SPARK-6681
Thanks
Best Regards
On Fri, Apr 24, 2015 at 11:40 AM, sourabh chaki
wrote:
> I am also facing the same problem with spark 1.3.0 and yarn-client and
> yarn-cluster mode. Launching yarn container failed and this is the er
I am also facing the same problem with spark 1.3.0 and yarn-client and
yarn-cluster mode. Launching yarn container failed and this is the error in
stderr:
Container: container_1429709079342_65869_01_01
spark 1.3.0
spark@pc-zjqdyyn1:~> tail /etc/profile
export JAVA_HOME=/usr/jdk64/jdk1.7.0_45
export PATH=$PATH:$JAVA_HOME/bin
#
# End of /etc/profile
#
But the error log shows:
Container: container_1427449644855_0092_02_01 on pc-zjqdyy04_45454
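One thing worth noting about the /etc/profile approach above: /etc/profile is only sourced by login shells, so daemons like the NodeManager (and the containers they launch) generally never see an export made there. A quick local demonstration, using `env -i` to simulate a process that never sourced /etc/profile:

```shell
# A process started with a clean environment behaves like one whose
# ancestors never sourced /etc/profile: the export is simply absent.
env -i sh -c 'echo "JAVA_HOME=[$JAVA_HOME]"'
# prints JAVA_HOME=[] even though /etc/profile exports JAVA_HOME
```

So the export needs to live somewhere the YARN daemons actually read, not just in the interactive login environment.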
> From: Williams, Ken <ken.willi...@windlogics.com>
> Date: Thursday, March 19, 2015 at 10:59 AM
> To: Spark list <user@spark.apache.org>
> Subject: JAVA_HOME problem with upgrade to 1.3.0
>
> […]
> Finally, I go and check the YARN app mast
> From: Ted Yu <yuzhih...@gmail.com>
> Date: Thursday, March 19, 2015 at 11:05 AM
>
> JAVA_HOME, an environment variable, should be defined on the node where
> appattempt_1420225286501_4699_02 ran.
Has this behavior changed in 1.3.0 since 1.2.1 though? Using 1.2.1 and making
no othe
JAVA_HOME, an environment variable, should be defined on the node where
appattempt_1420225286501_4699_02 ran.
Cheers
On Thu, Mar 19, 2015 at 8:59 AM, Williams, Ken
wrote:
> I’m trying to upgrade a Spark project, written in Scala, from Spark
> 1.2.1 to 1.3.0, so I changed my `build.sbt` lik
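Following up on Ted's point: the usual place to make JAVA_HOME visible to the YARN daemons and the containers they launch is Hadoop's own environment scripts on every node, rather than the users' shell profile. A sketch (the JDK path is an assumption; use whatever your nodes actually have):

```
# etc/hadoop/yarn-env.sh (and hadoop-env.sh) on every NodeManager host;
# the JDK path below is an example
export JAVA_HOME=/usr/jdk64/jdk1.7.0_45
```

Restart the NodeManagers after the change so newly launched containers inherit it.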
I’m trying to upgrade a Spark project, written in Scala, from Spark 1.2.1 to
1.3.0, so I changed my `build.sbt` like so:
-libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.1" % "provided"
+libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.0" % "provided"