Github user rdblue commented on a diff in the pull request:
https://github.com/apache/spark/pull/21977#discussion_r212714476
--- Diff: core/src/main/scala/org/apache/spark/internal/config/package.scala ---
@@ -114,6 +114,10 @@ package object config {
.checkValue(_ >= 0, "The off-heap memory size must not be negative")
.createWithDefault(0)
+ private[spark] val PYSPARK_EXECUTOR_MEMORY =
+ ConfigBuilder("spark.executor.pyspark.memory")
--- End diff ---
Yes, it should. I'll fix it.
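
(Editor's note: the diff above is truncated after the `ConfigBuilder` line. For context, a byte-size entry in this package is typically completed along the following lines. This is a sketch assuming Spark's internal `ConfigBuilder` API; the `doc` text shown here is illustrative, not quoted from the PR.)

```scala
  private[spark] val PYSPARK_EXECUTOR_MEMORY =
    ConfigBuilder("spark.executor.pyspark.memory")
      // Illustrative doc string, not the PR's actual wording.
      .doc("The amount of memory to be allocated to PySpark in each executor.")
      // bytesConf parses size strings like "512m"; values without a unit
      // are interpreted in the given unit (MiB here).
      .bytesConf(ByteUnit.MiB)
      // createOptional yields an OptionalConfigEntry: unset means no limit.
      .createOptional
```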
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]