Github user srowen commented on the pull request:

    https://github.com/apache/spark/pull/5261#issuecomment-87628770
  
    (You could give this a more specific title than "Update load-spark-env.sh")
    This is borderline important enough for its own JIRA, but we could consider it a minor follow-up fix for SPARK-4924.
    
    I'm not sure about this. For example `spark-class` sources this script with 
`. "$SPARK_HOME"/bin/load-spark-env.sh` and `pyspark` does similarly. So these 
have `SPARK_HOME` set.
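
    For context, those entry points set the variable themselves before sourcing the loader, roughly along these lines (a sketch of the pattern, not the exact 1.3 code):

    ```sh
    # Caller such as spark-class: derive SPARK_HOME from the script's own
    # location, then source the env loader, which can rely on it being set.
    export SPARK_HOME="$(cd "$(dirname "$0")/.." && pwd)"
    . "$SPARK_HOME"/bin/load-spark-env.sh
    ```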
    
    However `run-example` uses `. "$FWDIR"/bin/load-spark-env.sh`, and scripts in `sbin` use `. "$SPARK_PREFIX/bin/load-spark-env.sh"`. Clearly they don't necessarily expect `SPARK_HOME` to be set.
    
    CC @vanzin since this script actually used to refer to `FWDIR`:
    
https://github.com/apache/spark/commit/517975d89d40a77c7186f488547eed11f79c1e97
    
    The lines you reference don't exist in 1.3.0 though. Are you sure you're 
using 1.3.0?

