[ https://issues.apache.org/jira/browse/SPARK-8622?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14605921#comment-14605921 ]
Baswaraj commented on SPARK-8622:
---------------------------------
That's what I mean. Jars specified by --jars are not put on the classpath, but are
placed in the executor's working directory. I am expecting either the jars
themselves or the working directory to be on the classpath.
In 1.3.0, the working directory is on the classpath.
In 1.3.1 and later, neither the jars nor the working directory is on the classpath.
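A minimal sketch of the setup I am describing (the jar, class, and host names
below are placeholders, not the actual ones from my application):

    # Submit against a standalone master; myapp-extras.jar is a hypothetical jar
    # carrying the config files / classes the executors need.
    spark-submit \
      --master spark://master-host:7077 \
      --deploy-mode client \
      --jars myapp-extras.jar \
      --class com.example.MyApp \
      myapp.jar

    # On 1.3.1+ the executor gets myapp-extras.jar copied into its working
    # directory (under the worker's work/ directory), but neither that jar nor
    # the working directory itself ends up on the executor classpath, so class
    # and resource lookups from executor code fail.

    # Workaround that makes it run for me: add the working directory (".") to
    # the executor classpath in conf/spark-defaults.conf:
    spark.executor.extraClassPath  .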
> Spark 1.3.1 and 1.4.0 don't put the executor working directory on the executor
> classpath
> ----------------------------------------------------------------------------------
>
> Key: SPARK-8622
> URL: https://issues.apache.org/jira/browse/SPARK-8622
> Project: Spark
> Issue Type: Bug
> Components: Deploy
> Affects Versions: 1.3.1, 1.4.0
> Reporter: Baswaraj
>
> I ran into an issue where the executor is not able to pick up my configs/functions
> from my custom jar in standalone (client/cluster) deploy mode. I used the
> spark-submit --jars option to specify all the jars and configs to be used by
> the executors.
> All these files are placed in the executor's working directory, but not on the
> executor classpath. Also, the executor working directory is not on the executor
> classpath.
> I am expecting the executor to find all files specified via the spark-submit
> --jars option.
> In Spark 1.3.0 the executor working directory is on the executor classpath, so
> the app runs successfully.
> To run my application successfully with Spark 1.3.1 and later, I have to set the
> following option (conf/spark-defaults.conf):
> spark.executor.extraClassPath .
> Please advise.