viirya commented on issue #123:
URL: https://github.com/apache/arrow-datafusion-comet/issues/123#issuecomment-1965975507

   I'm not sure this makes sense. At this moment, I tend to think it doesn't.
   
   * It is not ready to use. As you said, we would need to implement a sqllogictest test executor for Spark that talks to Spark Connect, and I think that complicates things. To run the tests we need a Spark server, a Spark Connect client, the sqllogictest test executor, and the sqllogictest files themselves (see the sketch after this list); that is a lot of components, and they increase the maintenance effort.
   * The slt files would need to be run against both Spark and Comet. We do have unit tests that already work this way, but their number is limited.
   * We will soon have Spark test pipelines that cover Spark compatibility by leveraging the existing Spark tests. I think the Spark test suite itself is the most correct and has the highest coverage for verifying Comet's Spark compatibility. If there are corner cases missing from the Spark tests, they should be few, and we can probably add them directly to Spark or as unit tests in Comet.
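   
   To illustrate the moving parts in the first bullet, here is a minimal sketch of what an slt-style executor talking to Spark over Spark Connect might look like. This is not the proposal's actual design: it assumes PySpark >= 3.4 with Spark Connect support, a Spark Connect server already running at `sc://localhost:15002`, and a much-simplified slt record format; a real sqllogictest executor would be considerably more involved.
   
   ```python
   # Minimal sketch of an slt-style executor over Spark Connect (assumptions:
   # pyspark >= 3.4 with Spark Connect extras, a server at sc://localhost:15002,
   # and a simplified slt format: "query ..." header, SQL, "----", expected rows).
   from pyspark.sql import SparkSession
   
   def run_slt_file(path: str, remote: str = "sc://localhost:15002") -> None:
       # One of the required components: the Spark Connect client session.
       spark = SparkSession.builder.remote(remote).getOrCreate()
       try:
           with open(path) as f:
               blocks = [b for b in f.read().split("\n\n") if "----" in b]
           for block in blocks:
               sql_part, expected_part = block.split("----", 1)
               lines = sql_part.strip().splitlines()
               # Drop the "query <types>" header line of the record, keep the SQL.
               sql = "\n".join(lines[1:]) if lines and lines[0].startswith("query") else "\n".join(lines)
               expected = [line.strip() for line in expected_part.strip().splitlines()]
               # Execute on the remote Spark server and normalize rows for comparison.
               actual = ["\t".join(str(c) for c in row) for row in spark.sql(sql).collect()]
               assert actual == expected, f"Result mismatch for:\n{sql}"
       finally:
           spark.stop()
   ```
   
   Even in this stripped-down form, the test path already spans the Spark server, the Spark Connect client, the executor, and the slt files, which is the maintenance concern above.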
   