Github user CodingCat commented on the pull request:

    https://github.com/apache/spark/pull/1244#issuecomment-49153988
  
    @andrewor14 yeah, I agree with you. I just thought that somewhere (the 
documentation in an earlier version? I cannot find it now) said the user has to 
set this env variable, so I suggested prioritizing the worker-side SPARK_HOME; 
if that is not set, Spark falls back to the SPARK_HOME configured by the 
application (which may generate errors if the directory structure is not the 
same on the worker)....
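    
    Just to make the order I mean concrete, roughly something like this 
(names here are only illustrative, not the actual Spark code):
    
        object SparkHomeResolution {
          // Illustrative sketch of the fallback order I am describing:
          // prefer the worker's own SPARK_HOME, and only fall back to the
          // value shipped by the application, which may point to a path
          // that does not exist on the worker.
          def resolveSparkHome(workerSparkHome: Option[String],
                               applicationSparkHome: Option[String]): String =
            workerSparkHome
              .orElse(applicationSparkHome)
              .getOrElse(throw new IllegalStateException(
                "SPARK_HOME is not set on the worker or by the application"))
    
          def main(args: Array[String]): Unit = {
            // Worker-side value wins when both are present.
            println(resolveSparkHome(Some("/opt/spark-worker"), Some("/home/user/spark")))
          }
        }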
    
    
    I also noticed this JIRA: https://issues.apache.org/jira/browse/SPARK-2454 
(I left some comments there).


