In my log, I have found:

mylog.2:2018-10-19 20:00:50,455 WARN [dag-scheduler-event-loop] (Logging.scala:66) - Dropped 3498 events from appStatus since Fri Oct 19 19:25:05 UTC 2018.
mylog.2:2018-10-19 20:02:07,053 WARN [dispatcher-event-loop-1] (Logging.scala:66) - Dropped 123385 events from appStatus since Fri Oct 19 20:00:50 UTC 2018.
mylog.3:2018-10-19 19:23:42,922 ERROR [dispatcher-event-loop-3] (Logging.scala:70) - Dropping event from queue appStatus. This likely means one of the listeners is too slow and cannot keep up with the rate at which tasks are being started by the scheduler.
mylog.3:2018-10-19 19:23:42,928 WARN [dag-scheduler-event-loop] (Logging.scala:66) - Dropped 2 events from appStatus since Thu Jan 01 00:00:00 UTC 1970.
mylog.3:2018-10-19 19:25:05,822 WARN [dag-scheduler-event-loop] (Logging.scala:66) - Dropped 12190 events from appStatus since Fri Oct 19 19:23:42 UTC 2018.

I will try increasing spark.scheduler.listenerbus.eventqueue.capacity.
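
Something like this, I think (a minimal sketch; the app name and the 20000 value are placeholders I picked for illustration, the default capacity being 10000):

  import org.apache.spark.sql.SparkSession

  // The capacity must be set before the SparkContext is created;
  // it cannot be changed on a running application.
  val spark = SparkSession.builder()
    .appName("MyApp") // placeholder
    // Each listener-bus event queue holds 10000 events by default (Spark 2.3+);
    // a larger queue uses more memory but drops fewer events.
    .config("spark.scheduler.listenerbus.eventqueue.capacity", "20000")
    .getOrCreate()

or equivalently on the command line:

  spark-submit --conf spark.scheduler.listenerbus.eventqueue.capacity=20000 ...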
Shing
On Monday, 22 October 2018, 01:46:11 BST, Mark Hamstra <m...@clearstorydata.com> wrote:

Look for these log messages:
https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/scheduler/AsyncEventQueue.scala#L154
https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/scheduler/AsyncEventQueue.scala#L172


On Fri, Oct 19, 2018 at 4:42 PM Shing Hing Man <mat...@yahoo.com.invalid> wrote:

Hi,

I have just upgraded my application to Spark 2.3.2 from 2.2.1. When I run my Spark application on Yarn, the executor tab of the Spark UI shows 1499 active tasks, but there are only 145 cores in my executors. I have not changed any of the spark.ui.* parameters.

In Spark 2.2.1, the number of active tasks never exceeded 145, the total number of CPU cores of all the executors.

Also, my application takes 4 times longer to run with Spark 2.3.2 than with Spark 2.2.1.

I wonder if my application is slowed down because of too many active tasks.

Thanks in advance for any assistance!
Shing

