Github user vanzin commented on the pull request:

    https://github.com/apache/spark/pull/1845#issuecomment-52675629
  
    Hi @andrewor14,
    
    First, thanks for working on this. I wonder, though, whether a better 
approach would be to use this bootstrap class only to build the actual 
SparkSubmit command line, and still exec the SparkSubmit VM directly from 
bash?
    
    Something like this:
    
        javaArgs=$(java SparkSubmitBootstrapper "$@")
        exec java $javaArgs
    
    That way, all the logic for figuring out the command line to execute stays 
in Scala code, instead of the current split-brain state where it is divided 
between bash and Scala. The bash script could be greatly simplified, I think, 
and this approach has the added advantage of keeping only one VM alive while 
the app is running.
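    
    To make the idea concrete, here is a rough sketch of what the Scala side 
could look like (the class and environment variable names here are just 
placeholders, not necessarily what the patch would use):
    
        object SparkSubmitBootstrapper {
          // All command-line assembly (driver memory, extra java opts,
          // classpath) would live here instead of in bash.
          def main(args: Array[String]): Unit = {
            val driverMemory  = sys.env.getOrElse("SPARK_DRIVER_MEMORY", "512m")
            val extraJavaOpts = sys.env.get("SPARK_SUBMIT_OPTS").toSeq
    
            val javaArgs =
              Seq(s"-Xmx$driverMemory") ++
              extraJavaOpts ++
              Seq("-cp", sys.props("java.class.path"),
                  "org.apache.spark.deploy.SparkSubmit") ++
              args
    
            // The bash wrapper captures this output and runs: exec java $javaArgs
            println(javaArgs.mkString(" "))
          }
        }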
    
    What do you think?

