GitHub user HyukjinKwon opened a pull request:
https://github.com/apache/spark/pull/23055
[SPARK-26080][SQL] Disable 'spark.executor.pyspark.memory' always on Windows
## What changes were proposed in this pull request?
The `resource` package is Unix-specific; see
https://docs.python.org/2/library/resource.html and
https://docs.python.org/3/library/resource.html.
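Since `resource` cannot even be imported on Windows, the change skips the PySpark memory limit there. A minimal sketch of such a guard (illustrative only, with a hypothetical helper name, not the actual worker code in this PR):

```python
# Illustrative sketch, not the actual PySpark worker code from this PR.
import platform

try:
    import resource  # Unix-only; raises ImportError on Windows
    has_resource_module = True
except ImportError:
    has_resource_module = False


def maybe_set_pyspark_memory_limit(limit_bytes):
    """Apply the executor PySpark memory limit only where `resource` exists."""
    if not has_resource_module or platform.system() == "Windows":
        # spark.executor.pyspark.memory is effectively disabled here.
        return
    soft, hard = resource.getrlimit(resource.RLIMIT_AS)
    if soft == resource.RLIM_INFINITY or limit_bytes < soft:
        resource.setrlimit(resource.RLIMIT_AS, (limit_bytes, hard))
```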
Note that we document Windows support:
> Spark runs on both Windows and UNIX-like systems (e.g. Linux, Mac OS).
This should be backported into branch-2.4 to restore Windows support in
Spark 2.4.1.
## How was this patch tested?
Tested manually by mocking the changed logic, for example as sketched below.
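The Windows branch of the guard above can be exercised on a Unix machine by mocking the platform check (a hypothetical snippet, not the exact steps used for this PR):

```python
# Hypothetical manual check for the sketch above; not the PR's actual testing.
import platform
from unittest import mock

with mock.patch.object(platform, "system", return_value="Windows"):
    # With the guard in place, this should return without touching `resource`.
    maybe_set_pyspark_memory_limit(2 * 1024 * 1024 * 1024)
    print("memory limit skipped on simulated Windows")
```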
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/HyukjinKwon/spark SPARK-26080
Alternatively you can review and apply these changes as the patch at:
https://github.com/apache/spark/pull/23055.patch
To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:
This closes #23055
----
commit 2d3315a7dab429abc4d9ef5ed7f8f5484e8421f1
Author: hyukjinkwon <gurwls223@...>
Date: 2018-11-16T01:46:31Z
Disable 'spark.executor.pyspark.memory' on Windows always
----