[ https://issues.apache.org/jira/browse/SPARK-24063?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16483796#comment-16483796 ]
Apache Spark commented on SPARK-24063:
--------------------------------------

User 'efimpoberezkin' has created a pull request for this issue:
https://github.com/apache/spark/pull/21392

> Control maximum epoch backlog
> -----------------------------
>
>                 Key: SPARK-24063
>                 URL: https://issues.apache.org/jira/browse/SPARK-24063
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Structured Streaming
>    Affects Versions: 2.4.0
>            Reporter: Efim Poberezkin
>            Priority: Major
>
> As pointed out by [~joseph.torres] in
> [https://github.com/apache/spark/pull/20936], both the epoch queue and the
> commits/offsets maps are unbounded in the number of waiting epochs. Per his
> proposal, we should introduce a configuration option for the maximum epoch
> backlog and report an error if the number of waiting epochs exceeds it.

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
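The proposal above can be sketched roughly as follows. This is a hypothetical illustration, not the actual Spark implementation: the class name `BoundedEpochBacklog`, the parameter `maxEpochBacklog` (mirroring the proposed config option), and all methods here are assumptions made for the example.

```scala
import scala.collection.mutable

// Hypothetical sketch of the proposed behavior: bound the number of
// waiting epochs and fail loudly once the backlog limit is exceeded.
// `maxEpochBacklog` stands in for the configuration option suggested
// in the issue; none of these names come from the Spark codebase.
class BoundedEpochBacklog(maxEpochBacklog: Int) {
  // FIFO queue of epochs waiting to be committed.
  private val waitingEpochs = mutable.Queue[Long]()

  // Register a new waiting epoch; error out if the backlog is full.
  def addEpoch(epoch: Long): Unit = {
    if (waitingEpochs.size >= maxEpochBacklog) {
      throw new IllegalStateException(
        s"Number of waiting epochs exceeds maxEpochBacklog ($maxEpochBacklog)")
    }
    waitingEpochs.enqueue(epoch)
  }

  // Commit (remove) the oldest waiting epoch, if any.
  def commitOldest(): Option[Long] =
    if (waitingEpochs.isEmpty) None else Some(waitingEpochs.dequeue())

  def backlogSize: Int = waitingEpochs.size
}
```

With a limit of 2, a third `addEpoch` call before any commit would raise the error, which is the failure mode the issue asks to surface instead of unbounded growth.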