[
https://issues.apache.org/jira/browse/SPARK-12559?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16085746#comment-16085746
]
Stavros Kontopoulos commented on SPARK-12559:
---------------------------------------------
I did check it; it does not work because standalone mode uses
org.apache.spark.deploy.worker.DriverWrapper to launch the main class of the
jar, so it does not go through the submit code. I will clone this bug for Mesos
only and create two JIRA issues: one to refactor the spark-submit code so it is
re-usable from standalone mode, and another for the distributed cache. What do
you think?
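For context, the bypass can be sketched roughly as follows: a DriverWrapper-style launcher looks up the user's main class reflectively and invokes it directly, so none of spark-submit's --packages resolution runs in the driver JVM. This is a simplified, illustrative sketch, not DriverWrapper's actual code; DemoMain stands in for the user's application class.

```scala
// Illustrative sketch (not Spark's actual code): a launcher that invokes the
// user's main method via reflection, bypassing spark-submit's dependency
// resolution entirely. DemoMain stands in for the user's application.
object DemoMain {
  var invoked = false
  def main(args: Array[String]): Unit = { invoked = true }
}

object LauncherSketch {
  def launch(): Unit = {
    // Look up main(Array[String]) on the target and call it directly; any
    // --packages handling in the submit code path never executes here.
    val mainMethod = DemoMain.getClass.getMethod("main", classOf[Array[String]])
    mainMethod.invoke(DemoMain, Array.empty[String]: AnyRef)
  }
}
```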
> Cluster mode doesn't work with --packages
> -----------------------------------------
>
> Key: SPARK-12559
> URL: https://issues.apache.org/jira/browse/SPARK-12559
> Project: Spark
> Issue Type: Bug
> Components: Spark Submit
> Affects Versions: 1.3.0
> Reporter: Andrew Or
>
> From the mailing list:
> {quote}
> Another problem I ran into, which you might as well, is that --packages
> doesn't work with --deploy-mode cluster. It downloads the packages to a
> temporary
> location on the node running spark-submit, then passes those paths to the
> node that is running the Driver, but since that isn't the same machine, it
> can't find anything and fails. The driver process *should* be the one
> doing the downloading, but it isn't. I ended up having to create a fat JAR
> with all of the dependencies to get around that one.
> {quote}
> The problem is that we currently don't upload jars to the cluster. It seems
> that to fix this we should either (1) upload the jars, or (2) just run the
> packages code on the driver side. I slightly prefer (2) because it's simpler.
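Whichever option is chosen, the resolution has to start from the same input: the comma-separated group:artifact:version coordinates that --packages accepts. The sketch below is illustrative only (the names are assumptions, not Spark's actual API); under option (2), this parsing plus the subsequent Ivy/Maven resolution would simply run in the driver process, so the downloaded jars land on the machine that needs them.

```scala
// Illustrative sketch (names are hypothetical, not Spark's API): parsing
// --packages coordinates. Under option (2) this step, and the Ivy/Maven
// resolution that follows it, would run in the driver process rather than
// on the submitting machine.
case class MavenCoordinate(group: String, artifact: String, version: String)

def parsePackages(packages: String): Seq[MavenCoordinate] =
  packages.split(",").toSeq.filter(_.nonEmpty).map { p =>
    p.split(":") match {
      case Array(g, a, v) => MavenCoordinate(g, a, v)
      case _ => throw new IllegalArgumentException(
        s"Coordinate '$p' must be of the form group:artifact:version")
    }
  }
```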
--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]