Github user viesti commented on the pull request:
https://github.com/apache/spark/pull/8349#issuecomment-140666103
Ok, so I went into hack mode and uploaded my own git build of Spark to the Mesos
nodes. I didn't get the S3 part to work, though (I just brutally copied the
script onto the slaves):
```
0% /Users/xxxx/programming/xxx/spark/bin/spark-submit --master
mesos://xxx.xxx.xxx.xxx:7077 --deploy-mode cluster s3://xxx-mesos-apps/test.py
Error: Only local python files are supported: Parsed arguments:
master mesos://xxx.xxx.xxx.xxx:7077
deployMode cluster
executorMemory null
executorCores null
totalExecutorCores null
propertiesFile null
driverMemory null
driverCores null
driverExtraClassPath null
driverExtraLibraryPath null
driverExtraJavaOptions null
supervise false
queue null
numExecutors null
files null
pyFiles null
archives null
mainClass null
primaryResource s3://xxx-mesos-apps/test.py
name test.py
childArgs []
jars null
packages null
packagesExclusions null
repositories null
verbose false
```
I guess this is still very much a work in progress :)
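For what it's worth, the "Only local python files are supported" check suggests an obvious interim workaround: pull the script down from S3 first and submit the local copy. A minimal sketch, assuming the AWS CLI is installed and credentialed; the bucket, key, and master address below are placeholders, not taken from the actual setup:

```shell
# Fetch the application locally, since cluster deploy mode currently
# rejects a remote (s3://) primary resource, then submit the local copy.
aws s3 cp s3://my-bucket/test.py /tmp/test.py
spark-submit --master mesos://master-host:7077 --deploy-mode cluster /tmp/test.py
```

This obviously sidesteps rather than fixes the issue, since the driver still needs the file shipped to it somehow.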