GitHub user vanzin commented on the pull request:

    https://github.com/apache/spark/pull/12474#issuecomment-217981329
  
    I sort of like the idea of exposing the `ProcessBuilder`. While we could add a 
setting specifically for the working directory, it's cleaner to just use the Java 
API; and for the streams, Java 7 added some nice APIs that would be more cumbersome 
to wrap in configs.
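    For illustration, here's a rough sketch of what a caller could do once the 
builder is exposed. The `toProcessBuilder()` accessor is hypothetical (whatever 
name this PR ends up with); `directory()` and the `Redirect` factories are plain 
Java 7 `ProcessBuilder` API:

```java
import java.io.File;
import java.io.IOException;
import java.lang.ProcessBuilder.Redirect;

import org.apache.spark.launcher.SparkLauncher;

public class LauncherBuilderExample {
  public static void main(String[] args) throws IOException {
    // Hypothetical accessor: assumes the launcher exposes its ProcessBuilder.
    ProcessBuilder pb = new SparkLauncher()
        .setAppResource("/path/to/app.jar")
        .setMainClass("com.example.MyApp")
        .setMaster("local[*]")
        .toProcessBuilder();  // not part of the current API; name is illustrative

    // Working directory: covered by the plain java.lang.ProcessBuilder API,
    // so no dedicated launcher setting is needed.
    pb.directory(new File("/tmp/spark-work"));

    // Stream handling: Java 7 redirect API, awkward to mirror as launcher configs.
    pb.redirectErrorStream(true);
    pb.redirectOutput(Redirect.appendTo(new File("/tmp/spark-work/driver.log")));

    Process sparkApp = pb.start();
    // ... monitor sparkApp as needed ...
  }
}
```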
    
    What's missing from this PR is the same treatment for `startApplication`. I'm a 
little wary of suggesting an overloaded method that takes a `ProcessBuilder` argument 
(aside from the list of listeners), because that means you could pass in any instance, 
not necessarily one that will start a Spark application. But that's the easiest way to 
do it, and maybe the case I'm thinking of can be filed under "user error".
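    To make that concrete, here's the rough shape of the overload I mean (a sketch 
only, not an existing method); since the parameter is a plain 
`java.lang.ProcessBuilder`, the type system can't guarantee it was built by the 
launcher:

```java
import java.io.IOException;

import org.apache.spark.launcher.SparkAppHandle;

/** Sketch of the discussed overload; hypothetical, not part of SparkLauncher today. */
interface LauncherWithBuilderOverload {

  /** Existing shape: the launcher assembles its own command internally. */
  SparkAppHandle startApplication(SparkAppHandle.Listener... listeners)
      throws IOException;

  /**
   * Proposed addition: the caller supplies a (possibly customized) ProcessBuilder.
   * Nothing prevents passing a builder that doesn't start a Spark application at
   * all, which is the "user error" case above.
   */
  SparkAppHandle startApplication(ProcessBuilder builder,
      SparkAppHandle.Listener... listeners) throws IOException;
}
```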

