Github user holdenk commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21977#discussion_r209329689
  
    --- Diff: python/pyspark/worker.py ---
    @@ -259,6 +260,26 @@ def main(infile, outfile):
                                  "PYSPARK_DRIVER_PYTHON are correctly set.") %
                                 ("%d.%d" % sys.version_info[:2], version))
     
    +        # set up memory limits
    +        memory_limit_mb = int(os.environ.get('PYSPARK_EXECUTOR_MEMORY_MB', "-1"))
    +        total_memory = resource.RLIMIT_AS
    +        try:
    +            (total_memory_limit, max_total_memory) = resource.getrlimit(total_memory)
    +            msg = "Current mem: {0} of max {1}\n".format(total_memory_limit, max_total_memory)
    +            print(msg, file=sys.stderr)
    +
    +            if memory_limit_mb > 0 and total_memory_limit == resource.RLIM_INFINITY:
    --- End diff ---
    
    From our discussion in
    https://github.com/apache/spark/pull/21977#discussion_r208339172 I thought we
    were going to do this when there was no existing limit or when the requested
    limit was lower than the current one?
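
    Roughly what I was picturing, as a sketch only and not code from this patch
    (the helper name maybe_set_memory_limit is made up for illustration): apply
    the RLIMIT_AS cap whenever a positive limit is requested and either no limit
    is currently set or the requested limit is below the current soft limit.

        # Sketch only -- illustrates the condition described above.
        from __future__ import print_function

        import resource
        import sys


        def maybe_set_memory_limit(memory_limit_mb):
            """Cap RLIMIT_AS if a positive limit is requested and it is stricter
            than the current soft limit (or no limit is set)."""
            if memory_limit_mb <= 0:
                return
            soft_limit, hard_limit = resource.getrlimit(resource.RLIMIT_AS)
            requested = memory_limit_mb * 1024 * 1024
            # Lower the limit when none is set, *or* when the requested limit
            # is below the current soft limit.
            if soft_limit == resource.RLIM_INFINITY or requested < soft_limit:
                try:
                    resource.setrlimit(resource.RLIMIT_AS, (requested, hard_limit))
                except (ValueError, OSError) as e:
                    # Some platforms (e.g. macOS) refuse to lower RLIMIT_AS.
                    print("Could not set memory limit: %s" % e, file=sys.stderr)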


---
