Github user cloud-fan commented on a diff in the pull request:

    https://github.com/apache/spark/pull/19135#discussion_r139164432
  
    --- Diff: core/src/main/scala/org/apache/spark/storage/memory/MemoryStore.scala ---
    @@ -190,11 +190,11 @@ private[spark] class MemoryStore(
         // Initial per-task memory to request for unrolling blocks (bytes).
         val initialMemoryThreshold = unrollMemoryThreshold
         // How often to check whether we need to request more memory
    -    val memoryCheckPeriod = 16
     +    val memoryCheckPeriod = conf.getLong("spark.storage.unrollMemoryCheckPeriod", 16)
    --- End diff ---
    
    We should move these two configs to `org.apache.spark.internal.config`
    and add some documentation. I think they should be internal configs.
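
    A minimal sketch of what that could look like, using the existing
    `ConfigBuilder` helper in `org.apache.spark.internal.config` (the entry
    name and doc text below are illustrative, not the final wording):

        // In the org.apache.spark.internal.config package object (sketch only).
        private[spark] val UNROLL_MEMORY_CHECK_PERIOD =
          ConfigBuilder("spark.storage.unrollMemoryCheckPeriod")
            .internal()
            .doc("How often the unroll logic checks whether it needs to " +
              "request more memory while unrolling a block.")
            .longConf
            .createWithDefault(16)

        // MemoryStore would then read the typed entry instead of a raw key:
        val memoryCheckPeriod = conf.get(UNROLL_MEMORY_CHECK_PERIOD)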


---
