felipepessoto commented on issue #9003:
URL: 
https://github.com/apache/incubator-gluten/issues/9003#issuecomment-2795903686

   I think I found another scenario. For Merge metrics, some metrics are 2x: the metrics wrapped in `IncrementMetric` inside a `CaseWhen` are being incremented even when the branch condition is not true. Example:
   
   ```scala
     test("operation metrics - gluten test project") {
       withTempDir { tempDir =>
         val dirPath = tempDir.getAbsolutePath
         spark.range(0, 3)
           .write
           .format("delta")
           .save(dirPath)
   
         val metric1 = createMetric(spark.sparkContext, "number of rows1")
         val metric2 = createMetric(spark.sparkContext, "number of rows2")
         val metric3 = createMetric(spark.sparkContext, "number of rows3")
         val metric4 = createMetric(spark.sparkContext, "number of rows4")
         val incrMetric1 = IncrementMetric(TrueLiteral, metric1)
         val incrMetric2 = IncrementMetric(TrueLiteral, metric2)
         val incrMetric3 = IncrementMetric(TrueLiteral, metric3)
         val incrMetric4 = IncrementMetric(TrueLiteral, metric4)
   
         spark.read.format("delta").load(dirPath)
           .select(Column(CaseWhen(Seq(
          (Column((rand() < 0.0001).expr).expr, incrMetric1),
          (Column((rand() < 0.0001).expr).expr, incrMetric2),
          (TrueLiteral, incrMetric3)),
          incrMetric4)))
           .collect()
   
         println("metric1 should be 0. Value: " + metric1.value)
         println("metric2 should be 0. Value: " + metric2.value)
         println("metric3 should be 3. Value: " + metric3.value)
         println("metric4 should be 0. Value: " + metric4.value)
       }
     }
   ```
   
   It prints:
   
   ```
   metric1 should be 0. Value: 3
   metric2 should be 0. Value: 3
   metric3 should be 3. Value: 3
   metric4 should be 0. Value: 0
   ```
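
   For reference, the expected behavior can be modeled without Spark: CASE WHEN checks conditions in order and evaluates only the chosen branch's value expression, so a side-effecting increment in a never-taken branch should stay at zero. This is a minimal illustrative sketch; `caseWhen` and the counter variables here are hypothetical helpers, not Spark or Gluten APIs:

   ```scala
   // Spark-free model of short-circuiting CASE WHEN semantics.
   var metric1, metric2, metric3, metric4 = 0

   // Evaluate conditions in order; run only the taken branch's value thunk.
   def caseWhen(branches: Seq[(() => Boolean, () => Int)], default: () => Int): Int =
     branches.find { case (cond, _) => cond() } match {
       case Some((_, value)) => value() // only the matching branch is evaluated
       case None             => default()
     }

   // Three "rows", mirroring the test: two branches that never match, one that always does.
   for (_ <- 0 until 3) {
     caseWhen(Seq(
       (() => false, () => { metric1 += 1; 1 }),
       (() => false, () => { metric2 += 1; 2 }),
       (() => true,  () => { metric3 += 1; 3 })
     ), () => { metric4 += 1; 4 })
   }

   println(s"metric1=$metric1 metric2=$metric2 metric3=$metric3 metric4=$metric4")
   // prints metric1=0 metric2=0 metric3=3 metric4=0
   ```

   Under these semantics only `metric3` reaches 3, which is what the test above asserts and what vanilla Spark produces; the 2x Merge metrics suggest Gluten evaluates the `IncrementMetric` expressions of non-taken branches as well.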
   
   Another thing: if I change the last Case branch `(TrueLiteral, incrMetric3)` to `(FalseLiteral, incrMetric3)`, or to `(Column((rand() < 0.0001).expr).expr, incrMetric3)`, it throws an exception:
   
   ```
org.apache.gluten.exception.GlutenNotSupportException: Not supported to map spark function name to substrait function name: true, class name: IncrementMetric.
    at org.apache.gluten.expression.ExpressionConverter$.getAndCheckSubstraitName(ExpressionConverter.scala:750)
   ```


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
