milastdbx commented on code in PR #47666:
URL: https://github.com/apache/spark/pull/47666#discussion_r1764707190
##########
connector/docker-integration-tests/src/test/scala/org/apache/spark/sql/jdbc/v2/PostgresIntegrationSuite.scala:
##########
@@ -123,4 +131,33 @@ class PostgresIntegrationSuite extends DockerJDBCIntegrationV2Suite with V2JDBCT
)
}
}
+
+ test("SPARK-49162: Push down filter date_trunc function") {
Review Comment:
reading the documentation for pgsql:
https://www.postgresql.org/docs/current/functions-datetime.html#FUNCTIONS-DATETIME-TRUNC
and databricks:
https://docs.databricks.com/en/sql/language-manual/functions/date_trunc.html
I see that they support `microseconds` while we support `microsecond`.
Can we test for all of our supported precisions? Also, can we try different casings?
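
A minimal sketch of what such a parameterized check could look like, assuming hypothetical suite helpers (`tableName`, `sql`, and a `checkFilterPushed`-style assertion) and a hypothetical `event_time` column; the names would need to be swapped for whatever this suite actually exposes:

```scala
// Sketch only: loop over the precision keywords Spark's date_trunc accepts plus a few
// casings, and assert each predicate still gets pushed down to Postgres.
// `tableName`, `sql`, and `checkFilterPushed` are assumed suite helpers, not verified here.
val precisions = Seq(
  "YEAR", "QUARTER", "MONTH", "WEEK", "DAY",
  "HOUR", "MINUTE", "SECOND", "MILLISECOND", "MICROSECOND")
val casings: Seq[String => String] =
  Seq(_.toUpperCase, _.toLowerCase, _.toLowerCase.capitalize)

for (precision <- precisions; casing <- casings) {
  val unit = casing(precision)
  val df = sql(
    s"""SELECT * FROM $tableName
       |WHERE date_trunc('$unit', event_time) = TIMESTAMP '2022-01-01 00:00:00'""".stripMargin)
  checkFilterPushed(df)
}
```

Driving the combinations from one loop keeps the coverage explicit and makes it obvious which precision/casing pair failed if pushdown regresses.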