Dear Zeppelin community,

I need to pass some parameters to pyspark so that it can find some extra jars.

This is an example of the parameters I need to pass to pyspark:

pyspark \
  --jars /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages/hail/hail-all-spark.jar \
  --conf spark.driver.extraClassPath=/share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages/hail/hail-all-spark.jar \
  --conf spark.executor.extraClassPath=/share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages/hail/hail-all-spark.jar \
  --conf spark.serializer=org.apache.spark.serializer.KryoSerializer \
  --conf spark.kryo.registrator=is.hail.kryo.HailKryoRegistrator

How could I configure my spark interpreter to do this?
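
From the documentation I am guessing (but not certain) that the equivalent would be to set these as properties on the spark interpreter in the Interpreter menu of the Zeppelin UI, along these lines:

  spark.jars                      /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages/hail/hail-all-spark.jar
  spark.driver.extraClassPath     /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages/hail/hail-all-spark.jar
  spark.executor.extraClassPath   /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages/hail/hail-all-spark.jar
  spark.serializer                org.apache.spark.serializer.KryoSerializer
  spark.kryo.registrator          is.hail.kryo.HailKryoRegistrator

or alternatively to pass the original command-line flags through SPARK_SUBMIT_OPTIONS in conf/zeppelin-env.sh, e.g.:

  export SPARK_SUBMIT_OPTIONS="--jars /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages/hail/hail-all-spark.jar --conf spark.serializer=org.apache.spark.serializer.KryoSerializer --conf spark.kryo.registrator=is.hail.kryo.HailKryoRegistrator"

(and likewise for the other --conf flags). Is either of these the right approach, and if so, which one is recommended?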

Thank you very much