You can specify Spark properties as described here:
http://spark.apache.org/docs/latest/configuration.html
They will be passed to Spark via --conf.
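For example, assuming Zeppelin 0.8 or later with the generic configuration interpreter available, any key from that page can be set in a %spark.conf paragraph run before the first %pyspark paragraph (the property values below are illustrative only):

    %spark.conf
    spark.driver.memory 4g
    spark.sql.shuffle.partitions 200

Each line is a property name followed by its value, i.e. exactly what you would otherwise pass as --conf key=value on the command line.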
Manuel Sopena Ballesteros wrote on Friday, November 15, 2019 at 11:19 AM:
> Thank you very much, that worked.
>
> What about passing the --conf flag to pyspark?
>
> Manuel
From: Jeff Zhang [mailto:zjf...@gmail.com]
Sent: Friday, November 15, 2019 12:35 PM
To: users
Subject: Re: send parameters to pyspark
You can set the property spark.jars.
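In Zeppelin that can be done in the interpreter settings (Interpreter menu -> spark -> edit, then add the property), or, on 0.8+, inline via the generic configuration interpreter; a sketch using the jar path from the original question:

    %spark.conf
    spark.jars /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages/hail/hail-all-spark.jar

Note that a conf paragraph only takes effect if it runs before the Spark interpreter process starts, so restart the interpreter first if it is already running.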
Manuel Sopena Ballesteros wrote on Friday, November 15, 2019 at 9:30 AM:
> Dear zeppelin community,
>
> I need to send some parameters to pyspark so it can find extra jars.
>
> This is an example of the parameters I need to send to pyspark:
>
> pyspark \
>   --jars /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages/hail/hail-all-spark.jar \
>   --conf spark.dri
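For reference, Zeppelin can also be handed the original spark-submit style flags verbatim through SPARK_SUBMIT_OPTIONS in conf/zeppelin-env.sh (a sketch; the --conf key and value are left as placeholders because they are truncated in the archive):

    export SPARK_SUBMIT_OPTIONS="--jars /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages/hail/hail-all-spark.jar --conf <key>=<value>"

This is equivalent to setting spark.jars and the corresponding property in the interpreter settings.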