[ https://issues.apache.org/jira/browse/SPARK-21377?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16083060#comment-16083060 ]

Saisai Shao commented on SPARK-21377:
-------------------------------------

Thanks [~vanzin] for your comment.

Your comment is correct: specifying {{\--packages}} will not add jars to the AM 
classpath. My original thought was to add the main jar and secondary jars into 
the AM classpath automatically, but this would break the usage of 
"spark.driver.userClassPathFirst". So my proposal is to specify the AM extra 
classpath manually with "spark.yarn.am.extraClassPath", for example to put the 
HBase classpath there with this configuration. This requires the HBase 
dependencies to already exist on the cluster, but it does not touch the user 
application's classpath.
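
To make the proposal concrete, here is a minimal sketch of how a yarn client 
mode application could use this configuration; the HBase jar location is a 
hypothetical path and the app itself is only illustrative:

{code:scala}
import org.apache.spark.{SparkConf, SparkContext}

// Hypothetical location of the HBase client jars on the cluster nodes;
// adjust to wherever HBase is actually installed.
val hbaseClasspath = "/usr/hdp/current/hbase-client/lib/*"

val conf = new SparkConf()
  .setAppName("long-running-hbase-app")
  .setMaster("yarn")
  .set("spark.submit.deployMode", "client")
  // Proposed usage: extend only the AM classpath so that the AM-side
  // HBaseCredentialProvider can obtain delegation tokens, without
  // touching the driver or executor classpath.
  .set("spark.yarn.am.extraClassPath", hbaseClasspath)

val sc = new SparkContext(conf)
{code}

The same setting can also be passed on the {{spark-submit}} command line with 
{{\--conf spark.yarn.am.extraClassPath=...}}.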

> Add a new configuration to extend AM classpath in yarn client mode
> ------------------------------------------------------------------
>
>                 Key: SPARK-21377
>                 URL: https://issues.apache.org/jira/browse/SPARK-21377
>             Project: Spark
>          Issue Type: Bug
>          Components: YARN
>    Affects Versions: 2.2.0
>            Reporter: Yesha Vora
>            Priority: Minor
>
> In this issue we have a long running Spark application with secure HBase, 
> which requires {{HBaseCredentialProvider}} to get tokens periodically. We 
> specify HBase related jars with {{\--packages}}, but these dependencies are 
> not added into the AM classpath, so when {{HBaseCredentialProvider}} tries to 
> initialize HBase connections to get tokens, it fails.
> Currently, because jars specified with {{\--jars}} or {{\--packages}} are not 
> added into the AM classpath, the only way to extend the AM classpath is to use 
> "spark.driver.extraClassPath", which is supposed to be used in yarn cluster mode.
> So here we should figure out a solution, either to put these dependencies on 
> the AM classpath or to extend the AM classpath with a proper configuration.
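
As a concrete illustration of the description above, a minimal sketch of the 
failing setup, assuming hypothetical HBase Maven coordinates 
({{spark.jars.packages}} is the configuration form of {{\--packages}}):

{code:scala}
import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf()
  .setAppName("long-running-hbase-app")
  .setMaster("yarn")
  .set("spark.submit.deployMode", "client")
  // Hypothetical HBase coordinates and version; equivalent to passing
  // --packages on the spark-submit command line. These jars reach the
  // driver and the executors, but not the AM.
  .set("spark.jars.packages", "org.apache.hbase:hbase-client:1.2.6")

val sc = new SparkContext(conf)
// Periodic token fetching in the AM then fails, because the HBase
// classes are not on the AM classpath.
{code}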


