Github user rdblue commented on a diff in the pull request:
https://github.com/apache/spark/pull/21977#discussion_r208339172
--- Diff: python/pyspark/worker.py ---
@@ -259,6 +260,26 @@ def main(infile, outfile):
                                "PYSPARK_DRIVER_PYTHON are correctly set.") %
                                ("%d.%d" % sys.version_info[:2], version))

    +        # set up memory limits
    +        memory_limit_mb = int(os.environ.get('PYSPARK_EXECUTOR_MEMORY_MB', "-1"))
    +        total_memory = resource.RLIMIT_AS
    +        try:
    +            (total_memory_limit, max_total_memory) = resource.getrlimit(total_memory)
    +            msg = "Current mem: {0} of max {1}\n".format(total_memory_limit, max_total_memory)
    +            sys.stderr.write(msg)
    +
    +            if memory_limit_mb > 0 and total_memory_limit < 0:
--- End diff ---
Works for me. I'll update this.
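
For context, the pattern under discussion can be sketched roughly as follows. This is a minimal, hedged illustration of the diff's logic, not the final PR code: the helper `compute_address_space_limit` is a hypothetical name introduced here to make the condition testable, and it assumes the Unix-only `resource` module (RLIM_INFINITY is reported as -1), with the limit applied only when `PYSPARK_EXECUTOR_MEMORY_MB` is set and no cap is already in place:

```python
import os
import resource


def compute_address_space_limit(memory_limit_mb, current_soft_limit):
    """Return a new RLIMIT_AS soft limit in bytes, or None if no cap
    should be applied (no limit requested, or one already in place).

    Hypothetical helper mirroring the condition in the quoted diff:
    only cap when a positive limit is requested and the current soft
    limit is unlimited (reported as a negative value).
    """
    if memory_limit_mb > 0 and current_soft_limit < 0:
        return memory_limit_mb * 1024 * 1024  # MiB -> bytes
    return None


# Mirror the worker-side flow: read the requested limit from the
# environment, inspect the current address-space rlimit, and only
# tighten it when appropriate.
memory_limit_mb = int(os.environ.get('PYSPARK_EXECUTOR_MEMORY_MB', "-1"))
(soft_limit, hard_limit) = resource.getrlimit(resource.RLIMIT_AS)
new_limit = compute_address_space_limit(memory_limit_mb, soft_limit)
if new_limit is not None:
    resource.setrlimit(resource.RLIMIT_AS, (new_limit, hard_limit))
```

Keeping the hard limit unchanged and only lowering the soft limit means the process can still raise the soft limit later if needed, which is the conventional use of `setrlimit`.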
---
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]