[ https://issues.apache.org/jira/browse/SPARK-32597?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17176037#comment-17176037 ]
Apache Spark commented on SPARK-32597:
--------------------------------------

User 'SaurabhChawla100' has created a pull request for this issue:
https://github.com/apache/spark/pull/29413

> Tune Event Drop in Async Event Queue
> ------------------------------------
>
>                 Key: SPARK-32597
>                 URL: https://issues.apache.org/jira/browse/SPARK-32597
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 3.0.0
>            Reporter: Saurabh Chawla
>            Priority: Minor
>
> There are scenarios where we have seen events dropped in Spark, leaving the
> application in an inconsistent state (sometimes the application hangs).
>
> For example, this can happen with a large number of parallel tasks: the
> producer thread keeps adding events to the queue while the consumer thread
> consumes them at a slower rate, so the queue reaches its maximum size and
> further events are dropped.
> Often, if the queue size were slightly higher, say 10 or 20 percent above
> the existing size, the backlog could be absorbed and the event drop avoided
> at that point in time.
> In the current architecture, the size of the event queue is fixed at
> application start by setting
> spark.scheduler.listenerbus.eventqueue.capacity. Once this is set, the
> queue is a fixed-size LinkedBlockingQueue whose capacity cannot be changed
> at runtime to accommodate extra events before events are dropped.
> This Jira adds support for a VariableLinkedBlockingQueue to tune the
> dropping of events.
> VariableLinkedBlockingQueue ->
> https://www.rabbitmq.com/releases/rabbitmq-java-client/v3.5.4/rabbitmq-java-client-javadoc-3.5.4/com/rabbitmq/client/impl/VariableLinkedBlockingQueue.html

--
This message was sent by Atlassian Jira
(v8.3.4#803005)
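The drop behavior described in the issue can be sketched with the JDK's `LinkedBlockingQueue`, the class Spark's `AsyncEventQueue` currently uses: `offer()` returns `false` instead of blocking when the queue is full, and that is the point at which an event is counted as dropped. This is a minimal illustration, not Spark's actual code; the class name `EventDropDemo` and the event strings are made up for the example. Per the javadoc linked above, `VariableLinkedBlockingQueue` additionally exposes a `setCapacity(int)` method, which is what would allow the capacity to be raised at runtime to absorb a temporary backlog.

```java
import java.util.concurrent.LinkedBlockingQueue;

public class EventDropDemo {
    public static void main(String[] args) {
        // A bounded queue with capacity 2, standing in for the listener bus
        // queue sized by spark.scheduler.listenerbus.eventqueue.capacity.
        LinkedBlockingQueue<String> queue = new LinkedBlockingQueue<>(2);

        int dropped = 0;
        for (int i = 0; i < 5; i++) {
            // offer() does not block: it returns false once the queue is full,
            // which is where the producer would have to drop the event.
            if (!queue.offer("event-" + i)) {
                dropped++;
            }
        }
        // First 2 offers succeed, the remaining 3 are dropped.
        System.out.println("dropped=" + dropped);
    }
}
```

With a resizable queue, the producer could instead raise the capacity (within some bound) when the queue fills, and drop only if the enlarged queue is also exhausted.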