dongjoon-hyun commented on a change in pull request #23277: [SPARK-26327][SQL] Metrics in FileSourceScanExec not update correctly
URL: https://github.com/apache/spark/pull/23277#discussion_r240473934
##########
File path: sql/core/src/main/scala/org/apache/spark/sql/execution/DataSourceScanExec.scala
##########
@@ -316,7 +313,7 @@ case class FileSourceScanExec(
   override lazy val metrics =
     Map("numOutputRows" -> SQLMetrics.createMetric(sparkContext, "number of output rows"),
       "numFiles" -> SQLMetrics.createMetric(sparkContext, "number of files"),
-      "metadataTime" -> SQLMetrics.createMetric(sparkContext, "metadata time (ms)"),
+      "fileListingTime" -> SQLMetrics.createMetric(sparkContext, "file listing time (ms)"),
Review comment:
   Should we rename this metric here? The new name is clearer, but the PR title claims only to fix where the metric is updated, not to rename it.
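   For context, here is a minimal sketch (not the actual Spark source) of the pattern this diff touches: a physical operator declares its metrics via `SQLMetrics.createMetric` and bumps them where the values are actually produced. The operator name `ExampleScanExec` and its body are hypothetical.

```scala
import org.apache.spark.rdd.RDD
import org.apache.spark.sql.catalyst.InternalRow
import org.apache.spark.sql.catalyst.expressions.Attribute
import org.apache.spark.sql.execution.LeafExecNode
import org.apache.spark.sql.execution.metric.SQLMetrics

// Hypothetical operator, only to illustrate the metric declaration pattern.
case class ExampleScanExec(output: Seq[Attribute]) extends LeafExecNode {

  // Metrics are keyed by name; the second argument is the label shown in the
  // Spark UI (e.g. "file listing time (ms)" after the rename in this diff).
  override lazy val metrics = Map(
    "numOutputRows" -> SQLMetrics.createMetric(sparkContext, "number of output rows"),
    "numFiles" -> SQLMetrics.createMetric(sparkContext, "number of files"),
    "fileListingTime" -> SQLMetrics.createMetric(sparkContext, "file listing time (ms)"))

  override protected def doExecute(): RDD[InternalRow] = {
    val numOutputRows = longMetric("numOutputRows")
    // In a real scan the rows would come from the data source; the metric is
    // incremented on the executors as rows flow through the iterator.
    sparkContext.emptyRDD[InternalRow].mapPartitions { iter =>
      iter.map { row => numOutputRows += 1; row }
    }
  }
}
```

   Because these metrics are SQL accumulators, they only show up correctly in the UI if they are updated at the point where the value is known, which is the behavior this PR is adjusting; the rename is a separate, cosmetic change.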