SaurabhChawla100 commented on pull request #29413:
URL: https://github.com/apache/spark/pull/29413#issuecomment-672956203
   > > > I am wondering: if we don't want to drop any events, why don't we just 
set the capacity to Integer.MAX_VALUE? The LinkedBlockingQueue doesn't actually 
allocate that much memory up front.
   > > 
   > > 
   > > This will cause the driver to OOM for long-running jobs if a large 
number of events arrive at one point in time, since these events take up driver 
memory while they are present in the queue.
   > 
   > Could `VariableLinkedBlockingQueue` avoid OOM in that case?
   
   Using this VariableLinkedBlockingQueue, we increase the size of the queue at 
run time by a certain percentage threshold, which is less aggressive than setting 
the capacity to Integer.MAX_VALUE. I am also planning to add validation against 
the Spark driver memory before increasing the size of the 
VariableLinkedBlockingQueue.
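   To illustrate the idea (this is only a sketch, not Spark's actual `VariableLinkedBlockingQueue` implementation; the class name `GrowableQueueSketch`, the `growByPercent` method, and the growth factor below are illustrative assumptions), a queue whose capacity can be raised by a percentage at run time might look like this:

   ```java
   import java.util.ArrayDeque;
   import java.util.Deque;

   // Sketch of a bounded queue whose capacity can grow by a percentage at
   // run time, rather than being fixed at Integer.MAX_VALUE from the start.
   public class GrowableQueueSketch<E> {
       private final Deque<E> items = new ArrayDeque<>();
       private int capacity;

       public GrowableQueueSketch(int initialCapacity) {
           this.capacity = initialCapacity;
       }

       // Non-blocking offer: returns false (event would be dropped) when full,
       // giving the caller a chance to grow the queue and retry instead.
       public synchronized boolean offer(E e) {
           if (items.size() >= capacity) {
               return false;
           }
           items.addLast(e);
           return true;
       }

       public synchronized E poll() {
           return items.pollFirst();
       }

       // Raise capacity by a percentage threshold, e.g. 20%, which bounds
       // memory growth far more tightly than an Integer.MAX_VALUE capacity.
       public synchronized void growByPercent(int percent) {
           long grown = capacity + (long) capacity * percent / 100;
           capacity = (int) Math.min(grown, Integer.MAX_VALUE);
       }

       public synchronized int capacity() {
           return capacity;
       }

       public static void main(String[] args) {
           GrowableQueueSketch<Integer> q = new GrowableQueueSketch<>(2);
           System.out.println(q.offer(1));   // true
           System.out.println(q.offer(2));   // true
           System.out.println(q.offer(3));   // false: full at capacity 2
           q.growByPercent(100);             // capacity 2 -> 4
           System.out.println(q.offer(3));   // true after growing
       }
   }
   ```

   A driver-memory check of the kind mentioned above would then gate each `growByPercent` call, refusing to grow the queue once the estimated memory footprint of the queued events approaches the driver's limit.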
   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]



---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]

Reply via email to