Github user michaelmior commented on the issue:

    https://github.com/apache/spark/pull/12162
  
    As best I can tell, the code that was pushed here is incomplete. However, 
Spark's default cache eviction policy is LRU. You can find the code that 
performs eviction 
[here](https://github.com/apache/spark/blob/1e82335413bc2384073ead0d6d581c862036d0f5/core/src/main/scala/org/apache/spark/storage/memory/MemoryStore.scala#L501).
 It works by storing the blocks in a `LinkedHashMap` constructed with access 
ordering enabled, so the map keeps its entries ordered from least to most 
recently accessed and eviction can simply iterate from the least-recently-used 
end.
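    To illustrate the technique (this is a minimal sketch, not Spark's actual 
`MemoryStore` code — the class and method names below are made up): passing 
`accessOrder = true` to `java.util.LinkedHashMap`'s three-argument constructor 
makes every `get`/`put` move the touched entry to the tail, so the head of the 
iteration order is always the least-recently-used entry.

```scala
import java.util.{LinkedHashMap => JLinkedHashMap}
import scala.jdk.CollectionConverters._

// Hypothetical LRU cache sketch built on an access-ordered LinkedHashMap.
class LruCache[K, V](capacity: Int) {
  // accessOrder = true: iteration order is least- to most-recently accessed.
  private val entries =
    new JLinkedHashMap[K, V](32, 0.75f, /* accessOrder = */ true)

  def put(key: K, value: V): Unit = {
    entries.put(key, value)
    // Evict from the head of the iteration order (the LRU end) until we fit.
    val it = entries.keySet().iterator()
    while (entries.size() > capacity && it.hasNext) {
      it.next()
      it.remove()
    }
  }

  // get() also counts as an access and moves the entry to the MRU end.
  def get(key: K): Option[V] = Option(entries.get(key))

  def keys: List[K] = entries.keySet().asScala.toList
}
```

    For example, with capacity 2: after `put("a")`, `put("b")`, `get("a")`, 
`put("c")`, the entry `"b"` is the least recently used and gets evicted.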

