marblejenka commented on pull request #28200:
URL: https://github.com/apache/spark/pull/28200#issuecomment-618902470


   @tgravescs Certainly, we can achieve the same thing without SPARK_JARS_DIR by 
maintaining multiple SPARK_HOME directories, but one of the motivations for this 
PR is to keep a single SPARK_CONF_DIR. Also, regarding spark.yarn.jars and 
spark.yarn.archive: they do not change the dependencies of local processes such 
as spark-shell, for example, so they are not sufficient for my use case.
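
   For illustration, here is a minimal sketch of the two setups being compared, 
assuming the SPARK_JARS_DIR variable proposed in this PR is honored by the 
launch scripts (all paths and jar-set names below are hypothetical):

   ```bash
   # Today: one full SPARK_HOME per jar set, which means duplicating the
   # conf directory into every installation (hypothetical paths).
   SPARK_HOME=/opt/spark-a /opt/spark-a/bin/spark-shell
   SPARK_HOME=/opt/spark-b /opt/spark-b/bin/spark-shell

   # With this PR: a single SPARK_HOME and a single SPARK_CONF_DIR,
   # switching only the directory the jars are loaded from.
   export SPARK_HOME=/opt/spark
   export SPARK_CONF_DIR=/etc/spark/conf
   SPARK_JARS_DIR=/opt/spark-jars/set-a "$SPARK_HOME"/bin/spark-shell
   SPARK_JARS_DIR=/opt/spark-jars/set-b "$SPARK_HOME"/bin/spark-shell
   ```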
   @nchammas Thank you for sharing your use case. However, I think 
SPARK_DIST_CLASSPATH may work for it: SPARK_DIST_CLASSPATH lets you add 
additional jars to the classpath. I am not a pyspark user, though, so I may be 
misunderstanding something.
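
   As a minimal sketch of that suggestion (the extra-jars path is hypothetical), 
SPARK_DIST_CLASSPATH is typically set in conf/spark-env.sh:

   ```bash
   # conf/spark-env.sh
   # SPARK_DIST_CLASSPATH is appended to the classpath of the Spark JVMs
   # launched on this machine, so jars placed in the (hypothetical)
   # directory below become visible to spark-shell, pyspark, and friends.
   export SPARK_DIST_CLASSPATH="/opt/extra-jars/*:$SPARK_DIST_CLASSPATH"
   ```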
   
   Overall, SPARK_JARS_DIR is useful when you want to manage multiple sets of 
jars against a single installation and configuration.

