[ https://issues.apache.org/jira/browse/SPARK-17306?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15448464#comment-15448464 ]
Sean Owen commented on SPARK-17306:
-----------------------------------
[~thunterdb] it looks like you might have added the original implementation. I
also don't see that compressThreshold is ever used. Should it be checked
against count in the insert method, with compress() invoked once the count
exceeds the threshold?
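For illustration, a minimal self-contained sketch of where that guard could
live. BoundedSummaries is a hypothetical stand-in, not the actual Spark class,
and compress() here only stands in for the real merge into the sampled array:
{code}
import scala.collection.mutable.ArrayBuffer

// Simplified stand-in for QuantileSummaries, only to show where the missing
// threshold check could go; the real class also keeps the sampled Stats
// array and a relativeError.
class BoundedSummaries(val compressThreshold: Int) extends Serializable {
  val headSampled: ArrayBuffer[Double] = ArrayBuffer.empty
  private var count: Long = 0L

  def insert(x: Double): Unit = {
    headSampled += x
    count += 1
    // The proposed guard: flush the head buffer once it reaches the
    // threshold instead of letting it grow without bound.
    if (headSampled.size >= compressThreshold) compress()
  }

  // Stand-in for the real compression step, which would merge headSampled
  // into the sampled array; here it just empties the buffer.
  def compress(): Unit = headSampled.clear()
}
{code}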
> Memory leak in QuantileSummaries
> --------------------------------
>
> Key: SPARK-17306
> URL: https://issues.apache.org/jira/browse/SPARK-17306
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Reporter: Sean Zhong
>
> compressThreshold is not referenced anywhere in the implementation:
> {code}
> class QuantileSummaries(
>     val compressThreshold: Int,
>     val relativeError: Double,
>     val sampled: ArrayBuffer[Stats] = ArrayBuffer.empty,
>     private[stat] var count: Long = 0L,
>     val headSampled: ArrayBuffer[Double] = ArrayBuffer.empty)
>   extends Serializable
> {code}
> And it causes a memory leak: QuantileSummaries takes unbounded memory
> {code}
> val summary = new QuantileSummaries(10000, relativeError = 0.001)
> // Results in creating an array of size 100000000 !!!
> (1 to 100000000).foreach(summary.insert(_))
> {code}
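With a guard like the one sketched in the comment above, the same loop stays
bounded. This uses the hypothetical BoundedSummaries stand-in, not the real
class:
{code}
val summary = new BoundedSummaries(10000)
(1 to 100000000).foreach(summary.insert(_))
// The head buffer is flushed every 10000 inserts, so it never holds more
// than compressThreshold elements at a time.
assert(summary.headSampled.size < 10000)
{code}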