davintjong-db commented on code in PR #44649:
URL: https://github.com/apache/spark/pull/44649#discussion_r1451023537


##########
sql/core/src/main/scala/org/apache/spark/sql/execution/metric/SQLMetrics.scala:
##########
@@ -93,8 +91,8 @@ class SQLMetric(
   def +=(v: Long): Unit = add(v)
 
   // _value may be invalid, in many cases being -1. We should not expose it to the user
-  // and instead return defaultValidValue.
-  override def value: Long = if (!isValid) defaultValidValue else _value
+  // and instead return 0.
+  override def value: Long = if (!isValid) 0 else _value

Review Comment:
   Hm, it seems like the tests fail because of a test with a positive initValue:
   ```
   assert(SQLMetrics.createNanoTimingMetric(sparkContext, name = "m", initValue = 5).value === 5)
   ```
   I don't think we ever use a positive initValue, so we could either remove that test or stick with the original check `if (_value < 0)`.
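   
   For illustration, a minimal standalone sketch of the second option (sticking with the sign check). `MetricSketch` and `MetricSketchDemo` are hypothetical stand-ins, not the real `SQLMetric`, just to show that a positive initValue such as 5 would still be surfaced while the -1 sentinel stays hidden:
   ```scala
   // Hypothetical, stripped-down stand-in for SQLMetric, only to illustrate the check.
   class MetricSketch(initValue: Long = -1L) {
     private var _value: Long = initValue
   
     // Simplified accumulation; the real SQLMetric.add has more bookkeeping.
     def add(v: Long): Unit = _value += v
   
     // Negative values are treated as "not set" and are not exposed to the user;
     // a positive initValue (e.g. 5) passes through unchanged.
     def value: Long = if (_value < 0) 0 else _value
   }
   
   object MetricSketchDemo extends App {
     assert(new MetricSketch(initValue = 5).value == 5)   // positive initValue preserved
     assert(new MetricSketch(initValue = -1).value == 0)  // sentinel hidden from the user
   }
   ```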
   




