Github user CodingCat commented on the pull request:

    https://github.com/apache/incubator-spark/pull/599#discussion_r9767385
  
    Hi @aarondav, I'm just a bit confused. From the code:
    
    private[spark] val executorMemory = conf.getOption("spark.executor.memory")
        .orElse(Option(System.getenv("SPARK_MEM")))
        .map(Utils.memoryStringToMb)
        .getOrElse(512)
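
    For concreteness, here is a self-contained sketch of how that fallback chain resolves. The parser below is only a rough stand-in for Utils.memoryStringToMb, which I have not reproduced here:

    // Rough stand-in for Utils.memoryStringToMb: accepts "2g", "512m", or raw bytes.
    def memoryStringToMb(s: String): Int = {
      val lower = s.trim.toLowerCase
      if (lower.endsWith("g")) lower.dropRight(1).toInt * 1024
      else if (lower.endsWith("m")) lower.dropRight(1).toInt
      else (lower.toLong / (1024L * 1024L)).toInt
    }

    // Same precedence as the snippet above: explicit conf entry first,
    // then the SPARK_MEM environment variable, then a 512 MB default.
    def executorMemoryMb(conf: Map[String, String]): Int =
      conf.get("spark.executor.memory")
        .orElse(Option(System.getenv("SPARK_MEM")))
        .map(memoryStringToMb)
        .getOrElse(512)

    With conf = Map("spark.executor.memory" -> "2g") this yields 2048; with an empty map and no SPARK_MEM in the environment it falls back to 512.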
    
    SPARK_MEM sets the memory used by the executors, which is what this PR already handles.
    
    What you are proposing is to control the memory used by the driver. I think that is hard to achieve, since users may run spark-shell on a machine outside of Spark's control.
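
    To make the difficulty concrete: by the time any Spark code runs on the driver, the JVM heap has already been fixed by the -Xmx flag used to launch the process, so a conf value read afterwards cannot resize it. A minimal sketch (the names and the spark.driver.memory key here are mine, not from this PR):

    object DriverMemoryCheck {
      def main(args: Array[String]): Unit = {
        // Heap ceiling chosen when the JVM was launched (via -Xmx); it cannot be raised later.
        val maxHeapMb = Runtime.getRuntime.maxMemory / (1024 * 1024)
        // Hypothetical setting a user might pass, mirroring spark.executor.memory.
        val requestedMb = sys.props.get("spark.driver.memory").map(_.toInt).getOrElse(512)
        if (requestedMb > maxHeapMb) {
          // The best Spark could do at this point is warn; it cannot grow its own heap.
          System.err.println(s"Driver requested $requestedMb MB but the JVM was launched with only $maxHeapMb MB")
        }
      }
    }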
