Spark only needs to be present on the machine that launches the job via
spark-submit.
On Sat, Dec 17, 2016 at 3:59 PM, Jorge Machado wrote:
Hi Tiago,
Thanks for the update. Last question: does the spark-submit that you are using
need to be the same version on all YARN hosts?
Regards
Jorge Machado
> On 17 Dec 2016, at 16:46, Tiago Albineli Motta wrote:
Hi Jorge,
Here we are using an Apache Hadoop installation, and to run multiple
versions we just change the submit command on the client to use the correct
Spark version:
$SPARK_HOME/bin/spark-submit
and pass the correct Spark libs in the conf.
For Spark 2.0.0:
--conf spark.yarn.archive=
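A concrete invocation might look like the sketch below. It assumes the Spark 2.0.0 distribution is unpacked at a local path, and that its jars are zipped and uploaded to HDFS so YARN containers can fetch them; all paths and the application class (`com.example.MyApp`) are hypothetical placeholders, not from the thread.

```shell
# Hypothetical paths: adjust to your cluster layout.
# 1) Package the Spark 2.0.0 jars and put them on HDFS so YARN
#    containers can download them; the NodeManager hosts themselves
#    need no Spark installation.
zip -r -j spark-2.0.0-jars.zip /opt/spark-2.0.0/jars/
hdfs dfs -put spark-2.0.0-jars.zip /user/spark/spark-2.0.0-jars.zip

# 2) Submit with the client-side spark-submit of the version you want,
#    pointing spark.yarn.archive at the uploaded archive.
/opt/spark-2.0.0/bin/spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --conf spark.yarn.archive=hdfs:///user/spark/spark-2.0.0-jars.zip \
  --class com.example.MyApp \
  myapp.jar
```

With this setup, switching Spark versions is just a matter of pointing $SPARK_HOME (and the archive) at a different distribution on the client.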
Hi Everyone,
I have one question: is it possible to run, say, Spark 1.6.1 on HDP and then
run Spark 2.0.0 inside of it? Like passing the Spark libs with --jars? The idea
behind it is to avoid the default HDP installation and be able to deploy new
versions of Spark quickly.