[ https://issues.apache.org/jira/browse/SPARK-19707?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Saisai Shao updated SPARK-19707:
--------------------------------
    Summary: Improve the invalid path check for sc.addJar  (was: Improve the invalid path handling for sc.addJar)

> Improve the invalid path check for sc.addJar
> --------------------------------------------
>
>                 Key: SPARK-19707
>                 URL: https://issues.apache.org/jira/browse/SPARK-19707
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.2.0
>            Reporter: Saisai Shao
>
> Currently in Spark there are two issues when we add jars with an invalid path:
> * If the jar path is an empty string (e.g. {{--jars ",dummy.jar"}} yields an 
> empty first entry), Spark resolves it to the current directory and adds that 
> to the classpath / file server, which is unwanted.
> * If the jar path is invalid (the file doesn't exist), the file server doesn't 
> check this and still registers the jar; the exception is only thrown once the 
> job is running. A local path could be checked immediately, with no need to 
> wait until the task runs. We have a similar check in {{addFile}}, but 
> {{addJar}} lacks one. A rough sketch of such a check is given after this list.
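>
> Below is a minimal sketch of the kind of eager validation {{addJar}} could 
> perform, using a hypothetical standalone helper (not Spark's actual 
> implementation or API): reject empty entries and, for local file paths, 
> verify the file exists before the jar is handed to the file server. Remote 
> schemes are left for the cluster to resolve.
> {code:scala}
> import java.io.File
> import java.net.URI
>
> // Hypothetical helper; the object and method names are illustrative only.
> object AddJarPathCheck {
>
>   def validate(path: String): Either[String, String] = {
>     if (path == null || path.trim.isEmpty) {
>       // An empty entry (e.g. the "" produced by --jars ",dummy.jar")
>       // would otherwise resolve to the current working directory.
>       Left("empty jar path")
>     } else {
>       val uri = new URI(path)
>       val scheme = Option(uri.getScheme).getOrElse("file")
>       if (scheme == "file" || scheme == "local") {
>         val localPath = Option(uri.getPath).getOrElse(path)
>         if (new File(localPath).isFile) Right(path)
>         else Left(s"jar not found: $path")  // fail fast, before any task runs
>       } else {
>         Right(path)  // hdfs://, http://, ... cannot be checked locally
>       }
>     }
>   }
>
>   def main(args: Array[String]): Unit = {
>     // ",dummy.jar" splits into an empty entry plus "dummy.jar",
>     // mirroring how a comma-separated --jars value is parsed.
>     ",dummy.jar".split(",", -1).foreach(p => println(s"'$p' -> ${validate(p)}"))
>   }
> }
> {code}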



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
