caicancai commented on PR #44357:
URL: https://github.com/apache/spark/pull/44357#issuecomment-1862067057

   > > Thanks for your reply, but v2.MySQLIntegrationSuite does not have related function tests (such as date_add, datediff, etc.), and the MySQLDialect may need to be modified. Do you need me to add that as well?
   > 
   > Yes, you need to modify the `MySQLDialect` and add test cases.
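
   For context, the dialect-side change I have in mind looks roughly like the sketch below. This is only an illustration with placeholder names: the exact function names Spark uses on the V2 expression side, and whatever SQL-builder override MySQL needs for its own syntax, still have to be confirmed.
   ```scala
   import org.apache.spark.sql.jdbc.JdbcDialect

   // Rough sketch only: the real change would go into MySQLDialect itself, and
   // the function list below is an assumption, not the final patch.
   object DateFunctionPushdownSketch extends JdbcDialect {

     override def canHandle(url: String): Boolean =
       url.toLowerCase(java.util.Locale.ROOT).startsWith("jdbc:mysql")

     // Functions the remote MySQL engine is assumed to evaluate itself, so the
     // optimizer may push them down instead of computing them after the scan.
     private val pushableFunctions = Set("DATE_ADD", "DATEDIFF")

     override def isSupportedFunction(funcName: String): Boolean =
       pushableFunctions.contains(funcName)
   }
   ```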
   
   Hello, after trying this out I found that, to check whether the push-down succeeded, you need to use
   ```scala
   private def checkPushedInfo(df: DataFrame, expectedPlanFragment: String*): Unit = {
     withSQLConf(SQLConf.MAX_METADATA_STRING_LENGTH.key -> "1000") {
       df.queryExecution.optimizedPlan.collect {
         case _: DataSourceV2ScanRelation =>
           checkKeywordsExistsInExplain(df, expectedPlanFragment: _*)
       }
     }
   }
   ```
   But `checkKeywordsExistsInExplain` lives in `org.apache.spark.sql.ExplainSuiteHelper`, and `MySQLIntegrationSuite` does not belong to the same module. In this case, do I need to add these files (which may touch a lot of code), or is there another solution? Thank you.
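
   One workaround I am considering is to inline that small keyword check locally in the integration suite instead of depending on `ExplainSuiteHelper`. A minimal sketch, assuming Spark's `QueryExecution.explainString` API (the helper name is mine, and the `spark.sql.maxMetadataStringLength` bump from the original helper would probably still be wanted around it):
   ```scala
   import org.apache.spark.sql.DataFrame
   import org.apache.spark.sql.execution.ExtendedMode
   import org.apache.spark.sql.execution.datasources.v2.DataSourceV2ScanRelation

   // Hypothetical local helper: replicates what checkKeywordsExistsInExplain does,
   // so the docker integration suite needs no dependency on sql/core test helpers.
   private def checkPushedInfoLocally(df: DataFrame, expectedPlanFragment: String*): Unit = {
     // Only check the explain output when the query actually produced a V2 scan.
     val hasV2Scan = df.queryExecution.optimizedPlan.collectFirst {
       case _: DataSourceV2ScanRelation => ()
     }.isDefined
     if (hasV2Scan) {
       val explainOutput = df.queryExecution.explainString(ExtendedMode)
       expectedPlanFragment.foreach { fragment =>
         assert(explainOutput.contains(fragment),
           s"Pushed info '$fragment' not found in explain output:\n$explainOutput")
       }
     }
   }
   ```
   If duplicating just this small check in the docker-integration-tests module is acceptable, it would avoid moving `ExplainSuiteHelper` or adding a cross-module test dependency.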


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

