[ https://issues.apache.org/jira/browse/SPARK-43848?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
chengxingfu updated SPARK-43848:
--------------------------------

Description:

In the Web UI, when I run a query with a LIMIT clause, the duration of the WholeStageCodegen operator is always 0 ms.

The corresponding feature is SPARK-13916, commit id: 76958d820f57d23e3cbb5b7205c680a5daea0499. durationMs is updated only when we iterate the last row of a partition, so when we iterate only a few rows, the duration is always 0 ms.

The code below reproduces the issue:

spark.sql("use tpcds1g")
spark.sql("""
  select i_item_sk from item
  limit 100
""").collect


was:

In the Web UI, when I run a query with a LIMIT clause, the duration of the WholeStageCodegen operator is always 0 ms.

The corresponding feature is SPARK-13916, commit id: 5e86e926. durationMs is updated only when we iterate the last row of a partition, so when we iterate only a few rows, the duration is always 0 ms.

The code below reproduces the issue:

spark.sql("use tpcds1g")
spark.sql("""
  select i_item_sk from item
  limit 100
""").collect


> Web UI WholeStageCodegen duration is 0 ms
> -----------------------------------------
>
>                 Key: SPARK-43848
>                 URL: https://issues.apache.org/jira/browse/SPARK-43848
>             Project: Spark
>          Issue Type: Bug
>          Components: Web UI
>    Affects Versions: 3.2.0
>         Environment: spark local mode
>            Reporter: chengxingfu
>            Priority: Major
>         Attachments: 0ms.jpg
>
>
> In the Web UI, when I run a query with a LIMIT clause, the duration of the
> WholeStageCodegen operator is always 0 ms.
> The corresponding feature is SPARK-13916, commit id:
> 76958d820f57d23e3cbb5b7205c680a5daea0499. durationMs is updated only when we
> iterate the last row of a partition, so when we iterate only a few rows, the
> duration is always 0 ms.
> The code below reproduces the issue:
> spark.sql("use tpcds1g")
> spark.sql("""
> select i_item_sk from item
> limit 100
> """).collect
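The accounting pattern described above (the duration metric is flushed only when the iterator reaches the last row of the partition) can be sketched in Python. This is an illustrative model of the behavior, not Spark's actual BufferedRowIterator code; the class and field names here are hypothetical:

```python
import time

class TimedPartitionIterator:
    """Model of a per-partition iterator that reports its elapsed time
    to a metric only when it is fully exhausted."""

    def __init__(self, rows):
        self._rows = iter(rows)
        self._start = time.monotonic()
        self.duration_ms = 0  # metric shown in the UI

    def __iter__(self):
        return self

    def __next__(self):
        try:
            return next(self._rows)
        except StopIteration:
            # Reached only on full exhaustion of the partition. A LIMIT
            # that stops early never gets here, so duration_ms stays 0.
            self.duration_ms = int((time.monotonic() - self._start) * 1000)
            raise

partition = TimedPartitionIterator(range(1_000_000))
first_100 = [next(partition) for _ in range(100)]  # like LIMIT 100: early stop
print(partition.duration_ms)  # prints 0 -- the metric was never flushed
```

Because the LIMIT consumer stops pulling rows before StopIteration is raised, the metric-flush branch never runs, which matches the 0 ms durations seen in the Web UI.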