li-boxuan opened a new issue, #9901: URL: https://github.com/apache/incubator-gluten/issues/9901
### Backend

VL (Velox)

### Bug description

I added a unit test in https://github.com/apache/incubator-gluten/blob/b865e5f7eb3c16bd97a3c99d6de98e5cfb3e6084/gluten-ut/spark35/src/test/scala/org/apache/spark/sql/execution/GlutenStreamingQuerySuite.scala and it failed ONLY WHEN `RAS` is enabled.

Error logs:

```
2025-06-06T18:42:55.4541087Z GlutenStreamingQuerySuite:
2025-06-06T18:42:55.4959057Z 18:42:55.495 WARN org.apache.spark.sql.execution.streaming.ResolveWriteToStream: Temporary checkpoint location created which is deleted normally when the query didn't fail: /tmp/temporary-40bbc770-4310-4878-9d7b-6b05c59e5ae0. If it's required to delete it under any circumstances, please set spark.sql.streaming.forceDeleteTempCheckpointLocation to true. Important to know deleting temp checkpoint folder is best effort.
2025-06-06T18:42:55.4962614Z 18:42:55.495 WARN org.apache.spark.sql.execution.streaming.ResolveWriteToStream: spark.sql.adaptive.enabled is not supported in streaming DataFrames/Datasets and will be disabled.
2025-06-06T18:42:56.7794827Z - input row calculation with same V1 source used twice in self-join *** FAILED ***
2025-06-06T18:42:56.7796438Z   40 did not equal 20 (GlutenStreamingQuerySuite.scala:48)
```

The test checks the `numInputRows` metric of a self-join. The correct value should be 20, but Gluten (under RAS mode) reports 40, i.e. the shared source is counted twice.

### Gluten version

_No response_

### Spark version

None

### Spark configurations

_No response_

### System information

_No response_

### Relevant logs

```bash

```
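For context, the failing scenario is roughly of the following shape. This is a minimal sketch modeled on Spark's own `StreamingQuerySuite` self-join metrics test, not the exact body of the Gluten test; the variable names (`inputData`, `joined`) and the harness usage are assumptions, and it requires Spark's streaming test harness (`StreamTest`) to run:

```scala
// Sketch (assumed shape): one V1 MemoryStream source read twice via a
// self-join. When computing the numInputRows metric, Spark attributes
// the 10 added rows to each of the two logical reads of the same
// source, so the expected total is 20. The reported bug is that with
// RAS enabled, Gluten reports 40 instead.
import org.apache.spark.sql.execution.streaming.MemoryStream

val inputData = MemoryStream[Int]
val df = inputData.toDF()
val joined = df.join(df, "value") // same V1 source used twice

testStream(joined)(
  AddData(inputData, 1 to 10: _*),
  CheckAnswer(1 to 10: _*),
  AssertOnQuery { q =>
    // Last progress entry that actually processed data.
    val progress = q.recentProgress.filter(_.numInputRows > 0).lastOption
    // Expected: 20. Observed under RAS: 40.
    progress.exists(_.numInputRows == 20)
  }
)
```

The interesting part is the `AssertOnQuery` check: the metric is read from the query's progress reports, so a planner that duplicates (or fails to de-duplicate) the scan node will inflate `numInputRows` even though the query result itself is correct.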
