GitHub user tgravescs commented on the issue:

    https://github.com/apache/spark/pull/15009
  
    I'm not worried about the multiple SparkContexts; I say we just document this as cluster mode only, since we don't support multiple SparkContexts for anything.
    
    Thanks for pointing out the case of multiple apps being launched, though. I hadn't thought of that use case, since we were thinking about the Oozie launcher, which only launches one app per process.
    
    Changing SparkSubmit to not use system properties looks like a pretty big task. We would then need to figure out how settings get passed. I would think passing them on the command line could get too long, so you are looking at writing them to a file, and then you have to configure where that file goes, etc. Really the SparkSubmit front end is a bit clunky, since it's essentially one Client calling another Client which then finally does the work. Ideally we would get rid of that second wrapper. Did you have ideas on this?
    
    One other option would be for us to only support thread mode when launching a single app in cluster mode. I'm not super fond of it, as it could confuse users, but it at least gets us more functionality than we have now, and we can do the system properties change separately.
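    
    To make the single-app restriction concrete, something like the guard below is what I have in mind: allow at most one in-process ("thread mode") cluster-mode submission per JVM and fail fast on a second one, since the system-property plumbing can't isolate concurrent apps. This is only a sketch with hypothetical names, not the PR's actual API:

```scala
import java.util.concurrent.atomic.AtomicBoolean

// Hypothetical guard: permit only one in-process cluster-mode submission per JVM.
object InProcessSubmitGuard {
  private val submitted = new AtomicBoolean(false)

  def submitInProcess(runApp: () => Unit): Unit = {
    if (!submitted.compareAndSet(false, true)) {
      throw new IllegalStateException(
        "Only one in-process cluster-mode submission is supported per JVM; " +
          "launch additional applications in separate processes.")
    }
    // Run the application on its own thread so the caller is not blocked.
    val runnable = new Runnable { def run(): Unit = runApp() }
    new Thread(runnable, "in-process-spark-submit").start()
  }
}
```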

