> From: Williams, Ken <ken.willi...@windlogics.com>
> Date: Thursday, March 19, 2015 at 10:59 AM
> To: Spark list <user@spark.apache.org>
> Subject: JAVA_HOME problem with upgrade to 1.3.0
>
> […]
> Finally, I go and check the YARN app master […]
> From: Ted Yu <yuzhih...@gmail.com>
> Date: Thursday, March 19, 2015 at 11:05 AM
>
> JAVA_HOME, an environment variable, should be defined on the node where
> appattempt_1420225286501_4699_02 ran.
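Ted's point is that YARN launches the application master in a container on some cluster node, and that container's environment must define JAVA_HOME. A quick sketch of checking it on a node, and of passing it to YARN containers via Spark's documented `spark.yarn.appMasterEnv.*` / `spark.executorEnv.*` properties (the JDK path below is an assumption):

```shell
# On the node that ran the failed attempt, check whether JAVA_HOME is set
echo "${JAVA_HOME:-JAVA_HOME is not set}"

# One way to hand JAVA_HOME to YARN containers is via spark-defaults.conf;
# the JDK path here is an assumption -- use your cluster's actual path:
#   spark.yarn.appMasterEnv.JAVA_HOME  /usr/lib/jvm/java-7-openjdk
#   spark.executorEnv.JAVA_HOME        /usr/lib/jvm/java-7-openjdk
```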
Has this behavior changed in 1.3.0 since 1.2.1 though? Using 1.2.1 and making
no other […]
Cheers
On Thu, Mar 19, 2015 at 8:59 AM, Williams, Ken wrote:
I’m trying to upgrade a Spark project, written in Scala, from Spark 1.2.1 to
1.3.0, so I changed my `build.sbt` like so:
-libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.1" % "provided"
+libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.0" % "provided"
[…]
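For context, the surrounding `build.sbt` might look like this minimal sketch (the project name and Scala version are assumptions; Spark 1.3.0 artifacts were published for Scala 2.10, so `%%` resolves to `spark-core_2.10`):

```scala
// build.sbt -- minimal sketch; name and scalaVersion are assumptions
name := "my-spark-job"

scalaVersion := "2.10.4"

// "provided" keeps spark-core out of the assembly jar, since the
// cluster supplies Spark's classes at runtime via spark-submit
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.0" % "provided"
```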