[ https://issues.apache.org/jira/browse/SPARK-4160?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Marcelo Vanzin updated SPARK-4160:
----------------------------------
Issue Type: Improvement (was: Bug)
> Standalone cluster mode does not upload all needed jars to driver node
> ----------------------------------------------------------------------
>
> Key: SPARK-4160
> URL: https://issues.apache.org/jira/browse/SPARK-4160
> Project: Spark
> Issue Type: Improvement
> Components: Spark Core
> Affects Versions: 1.2.0
> Reporter: Marcelo Vanzin
>
> If you look at {{DriverRunner.scala}}, there is code to download the main
> application jar from the launcher node. But that is the only jar that gets
> downloaded: if the driver depends on any of the jars or files specified via
> {{spark-submit --jars <list> --files <list>}}, it won't be able to run.
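> For illustration, a cluster-mode submission like the following (the master
> host, paths, and class name are made up) hits this: the extra jar and file
> never make it to the driver's working directory.
> {code}
> spark-submit --master spark://master:7077 --deploy-mode cluster \
>   --class com.example.Main \
>   --jars /path/to/extra-lib.jar \
>   --files /path/to/app.conf \
>   /path/to/app.jar
> {code}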
> It should be possible to use the same mechanism to distribute the other jars
> and files to the driver node, even if that's not the most efficient way to do
> it. That way, at least, you don't need any external means of distributing the
> files.
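> A rough sketch of what that could look like (illustrative only: the method
> name, the {{uris}} parameter, and the use of Hadoop's {{FileSystem}} API are
> assumptions, not how {{DriverRunner}} is currently written):
> {code:scala}
> import java.io.File
> import java.net.URI
> import org.apache.hadoop.conf.Configuration
> import org.apache.hadoop.fs.{FileSystem, Path}
>
> // Copy every URI listed via --jars / --files into the driver's working
> // directory, reusing the same kind of copy the main application jar gets.
> def fetchDriverDependencies(uris: Seq[String], driverDir: File): Unit = {
>   val hadoopConf = new Configuration()
>   uris.foreach { u =>
>     val src = new Path(new URI(u))
>     val dst = new Path(new File(driverDir, src.getName).toURI)
>     FileSystem.get(src.toUri, hadoopConf).copyToLocalFile(src, dst)
>   }
> }
> {code}
> The dependencies would then sit in the driver's working directory alongside
> the main jar, with no shared filesystem required.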