[ 
https://issues.apache.org/jira/browse/SPARK-26760?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16760534#comment-16760534
 ] 

ABHISHEK KUMAR GUPTA commented on SPARK-26760:
----------------------------------------------

Even after setting spark.ui.liveUpdate.period to 0 and submitting a job with sc.parallelize(1 to 10000, 116000).count(), the UI is still not correct: it displays more active tasks than there are cores.
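For reference, the steps from the issue plus the workaround above can be combined into a single launch. This is a reproduction sketch, not a fix; it assumes a YARN cluster is reachable, and spark.ui.liveUpdate.period=0 is the setting tried above (it makes the live UI store apply updates immediately instead of on a timed interval):

```shell
# Reproduction sketch (assumes a running YARN cluster).
# spark.ui.liveUpdate.period=0 is the workaround attempted in this comment.
bin/spark-shell --master yarn \
  --conf spark.ui.liveUpdate.period=0 \
  --conf spark.dynamicAllocation.enabled=true \
  --conf spark.dynamicAllocation.initialExecutors=3 \
  --conf spark.dynamicAllocation.minExecutors=1 \
  --conf spark.dynamicAllocation.executorIdleTimeout=60s \
  --conf spark.dynamicAllocation.maxExecutors=5

# Then, inside the shell, submit the many-task job and watch the
# Executors tab of the UI:
# scala> sc.parallelize(1 to 10000, 116000).count()
```

Even with the period set to 0, the Active Tasks column still exceeds the core count, so the batching interval does not appear to be the cause.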

> [Incorrect display in Spark UI Executor Tab: number of cores is 4 but 
> Active Tasks shows 5]
> -------------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-26760
>                 URL: https://issues.apache.org/jira/browse/SPARK-26760
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.4.0
>         Environment: Spark 2.4
>            Reporter: ABHISHEK KUMAR GUPTA
>            Priority: Major
>         Attachments: SPARK-26760.png
>
>
> Steps:
>  # Launch Spark Shell 
>  # bin/spark-shell --master yarn  --conf spark.dynamicAllocation.enabled=true 
> --conf spark.dynamicAllocation.initialExecutors=3 --conf 
> spark.dynamicAllocation.minExecutors=1 --conf 
> spark.dynamicAllocation.executorIdleTimeout=60s --conf 
> spark.dynamicAllocation.maxExecutors=5
>  # Submit a Job sc.parallelize(1 to 10000,116000).count()
>  # Check the YARN UI Executor Tab for the RUNNING application
>  # UI displays Number of cores as 4 while the Active Tasks column shows 5
> Expected:
> The number of Active Tasks should be the same as the number of cores.
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
