Github user dragos commented on the pull request:

    https://github.com/apache/spark/pull/8358#issuecomment-138833460
  
    @tnachen I copy-pasted the Spark configuration docs in my previous 
comments. Here's what I meant by other parameters though:
    
    - `spark.local.dir` - Directory to use for "scratch" space in Spark
    - `SPARK_LOCAL_DIRS` - In Spark 1.0 and later this will override the 
previous one
    
    So, yes, the order of evaluation is one of the problems. Is there a way to 
configure `MESOS_DIRECTORY` itself? People may want to put *multiple* 
directories on different volumes, for performance. Since `MESOS_DIRECTORY` is 
just a single directory, that won't be possible anymore.
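    To make the precedence concrete, here's a sketch of how the two settings 
from the docs interact today (the paths are purely illustrative):
    
    ```
    # spark-defaults.conf: multiple scratch directories, comma-separated,
    # spread across volumes for I/O parallelism
    spark.local.dir    /mnt/disk1/spark,/mnt/disk2/spark
    
    # spark-env.sh: in Spark 1.0+ this environment variable, when set,
    # overrides spark.local.dir entirely
    SPARK_LOCAL_DIRS=/mnt/disk1/spark,/mnt/disk2/spark
    ```
    
    Forcing everything into the single `MESOS_DIRECTORY` would lose the 
comma-separated multi-volume case above.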
    
    It's different in YARN because YARN already has this concept: it gives you 
*several* scratch directories, precisely for this purpose.
    
    The other (big) problem is dynamic allocation with the external shuffle 
service (but we can test for it and only use `MESOS_DIRECTORY` when the 
shuffle service is disabled). See my comments on 
[SPARK-9708](https://issues.apache.org/jira/browse/SPARK-9708).

