Github user rdblue commented on a diff in the pull request:

    https://github.com/apache/spark/pull/23055#discussion_r234080290
  
    --- Diff: python/pyspark/worker.py ---
    @@ -268,9 +272,11 @@ def main(infile, outfile):
     
             # set up memory limits
             memory_limit_mb = int(os.environ.get('PYSPARK_EXECUTOR_MEMORY_MB', "-1"))
    -        total_memory = resource.RLIMIT_AS
    -        try:
    -            if memory_limit_mb > 0:
    +        # 'PYSPARK_EXECUTOR_MEMORY_MB' should be undefined on Windows because it depends on
    +        # resource package which is a Unix specific package.
    +        if memory_limit_mb > 0:
    --- End diff ---
    
    It seems brittle to disable this on the JVM side and rely on it here. Can we also set a flag in the ImportError case and check for it here?
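
    A minimal sketch of the flag-based guard suggested above (not the actual worker.py change; the helper name `should_set_memory_limit` is hypothetical): record whether the Unix-only `resource` module imported, and consult that flag together with the configured limit.

```python
import os

# Record at import time whether the Unix-only `resource` module exists.
try:
    import resource
    has_resource_module = True
except ImportError:  # e.g. on Windows, where `resource` is unavailable
    has_resource_module = False

def should_set_memory_limit(env=os.environ):
    """Return True only when a limit is configured AND `resource` imported."""
    memory_limit_mb = int(env.get('PYSPARK_EXECUTOR_MEMORY_MB', "-1"))
    return memory_limit_mb > 0 and has_resource_module
```

    With this guard, the worker no longer depends solely on the JVM side leaving `PYSPARK_EXECUTOR_MEMORY_MB` unset on Windows: even if the variable leaks through, the ImportError flag keeps the `resource.setrlimit` path from being taken.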


---
