Github user andrewor14 commented on the pull request:

    https://github.com/apache/spark/pull/1392#issuecomment-49353609
  
    Hey @YanTangZhai, on second thought I think we should keep the config, but 
not set it by default as we do currently. A user may have multiple 
installations of Spark on the same Worker machine, and "spark.home" previously 
gave them a way to point each application at a specific installation. We should 
keep this functionality, but make it optional rather than forcing it on them.
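    For context, the kind of per-application override at stake here could be 
expressed in an application's conf/spark-defaults.conf (the path below is 
hypothetical, just to illustrate the multiple-installations scenario):

    ```
    # Point this application at a specific Spark installation on the Worker
    # (hypothetical path; set per application rather than cluster-wide)
    spark.home    /opt/spark-custom-build
    ```

    Keeping "spark.home" optional would mean a line like this is only needed 
when the default installation is not the one the application wants.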
    
    Also, since we no longer need "spark.home", it would be good to remove all 
occurrences of it to avoid confusion (except where needed for backwards 
compatibility). However, this is slightly tricky because Mesos handles this 
differently from the other modes.
    
    So I suggest this: I will take over from here, since this change seems a 
little more involved than we originally imagined. How does that sound?

