I followed the steps described above and I still get this error:

Error: Could not find or load main class
org.apache.spark.deploy.yarn.ExecutorLauncher


I am trying to build Spark 1.3 on HDP 2.2.
I built Spark from source using:
build/mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.6.0 -Phive
-Phive-thriftserver -DskipTests package

Maybe I am not putting the correct YARN assembly on HDFS, or is there some
other issue?
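
In case it matters, this is roughly how I am publishing the assembly
(paths are from my build and the exact jar name will vary with your build
options; spark.yarn.jar is the standard setting for pointing YARN at an
assembly on HDFS):

# locate the assembly produced by the build
ls assembly/target/scala-2.10/spark-assembly-*.jar
# copy it to HDFS
hdfs dfs -mkdir -p /user/spark/share/lib
hdfs dfs -put assembly/target/scala-2.10/spark-assembly-*.jar \
    /user/spark/share/lib/
# and reference it in conf/spark-defaults.conf:
#   spark.yarn.jar hdfs:///user/spark/share/lib/spark-assembly-<version>.jar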

Thanks,
Udit

On Mon, Mar 30, 2015 at 10:18 AM, Zhan Zhang <zzh...@hortonworks.com> wrote:

>  Hi Folks,
>
>  To summarize how to run Spark on the HDP distribution:
>
>  1. The Spark version has to be 1.3.0 or above if you are using the
> upstream distribution. This configuration is mainly for HDP rolling-upgrade
> purposes, and the patch only went into upstream Spark as of 1.3.0.
>
>  2. In $SPARK_HOME/conf/spark-defaults.conf, add the following settings:
>
>    spark.driver.extraJavaOptions -Dhdp.version=xxxxx
>    spark.yarn.am.extraJavaOptions -Dhdp.version=xxxxx
>
>  3. In $SPARK_HOME/java-opts, add the following option (a combined
> example of steps 2 and 3 is sketched below):
>    -Dhdp.version=xxxxx
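>
>  As a concrete sketch of steps 2 and 3, using 2.2.0.0-2041 (the build
> number mentioned elsewhere in this thread) as an example value;
> substitute your own cluster's HDP build number:
>
>    # conf/spark-defaults.conf
>    spark.driver.extraJavaOptions -Dhdp.version=2.2.0.0-2041
>    spark.yarn.am.extraJavaOptions -Dhdp.version=2.2.0.0-2041
>
>    # java-opts
>    -Dhdp.version=2.2.0.0-2041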
>
>  Thanks.
>
>  Zhan Zhang
>
>
>
>  On Mar 30, 2015, at 6:56 AM, Doug Balog <doug.sparku...@dugos.com> wrote:
>
> The “best” solution to spark-shell’s problem is to create a file
> $SPARK_HOME/conf/java-opts
> with “-Dhdp.version=2.2.0.0-2041”
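>
> For example, something along these lines (the version string is the one
> from this thread; use the build number that matches your cluster):
>
>    echo "-Dhdp.version=2.2.0.0-2041" > "$SPARK_HOME/conf/java-opts"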
>
> Cheers,
>
> Doug
>
> On Mar 28, 2015, at 1:25 PM, Michael Stone <mst...@mathom.us> wrote:
>
> I've also been having trouble running 1.3.0 on HDP. The
> spark.yarn.am.extraJavaOptions -Dhdp.version=2.2.0.0-2041
> configuration directive seems to work with pyspark, but does not propagate
> when using spark-shell. (That is, everything works fine with pyspark, and
> spark-shell fails with the "bad substitution" message.)
>
> Mike Stone
>
