[ https://issues.apache.org/jira/browse/SPARK-46077?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Max Gekk reassigned SPARK-46077:
--------------------------------

    Assignee: Pablo Langa Blanco

> Error in postgresql when pushing down filter by timestamp_ntz field
> -------------------------------------------------------------------
>
>                 Key: SPARK-46077
>                 URL: https://issues.apache.org/jira/browse/SPARK-46077
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.5.0
>            Reporter: Marina Krasilnikova
>            Assignee: Pablo Langa Blanco
>            Priority: Minor
>              Labels: pull-request-available
>
> Code to reproduce:
>
> SparkSession sparkSession = SparkSession
>     .builder()
>     .appName("test-app")
>     .master("local[*]")
>     .config("spark.sql.timestampType", "TIMESTAMP_NTZ")
>     .getOrCreate();
>
> String url = "...";
> String catalogPropPrefix = "spark.sql.catalog.myc";
> sparkSession.conf().set(catalogPropPrefix, JDBCTableCatalog.class.getName());
> sparkSession.conf().set(catalogPropPrefix + ".url", url);
>
> Map<String, String> options = new HashMap<>();
> options.put("driver", "org.postgresql.Driver");
> // options.put("pushDownPredicate", "false"); // it works fine if this line is uncommented
>
> Dataset<Row> dataset = sparkSession.read()
>     .options(options)
>     .table("myc.demo.`My table`");
> dataset.createOrReplaceTempView("view1");
>
> String sql = "select * from view1 where `my date` = '2021-04-01 00:00:00'";
> Dataset<Row> result = sparkSession.sql(sql);
> result.show();
> result.printSchema();
>
> The field `my date` is of type timestamp. This code fails with an org.postgresql.util.PSQLException reporting a syntax error.
>
> String sql = "select * from view1 where `my date` = to_timestamp('2021-04-01 00:00:00', 'yyyy-MM-dd HH:mm:ss')"; // this query also fails
> String sql = "select * from view1 where `my date` = date_trunc('DAY', to_timestamp('2021-04-01 00:00:00', 'yyyy-MM-dd HH:mm:ss'))"; // but this one works
>
> Is this a bug, or did I get something wrong?
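For context on the filter being pushed down: PostgreSQL accepts a timestamp-without-time-zone value written as a `TIMESTAMP '...'` literal. The sketch below (a standalone illustration using only `java.time`, not Spark's actual JDBC dialect code; the class and method names are made up for this example) shows how such a literal can be rendered from a `LocalDateTime`:

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

public class NtzLiteral {

    // Render a LocalDateTime as a PostgreSQL timestamp-without-time-zone
    // literal, e.g. TIMESTAMP '2021-04-01 00:00:00'.
    static String toPostgresLiteral(LocalDateTime ts) {
        DateTimeFormatter f = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss");
        return "TIMESTAMP '" + ts.format(f) + "'";
    }

    public static void main(String[] args) {
        LocalDateTime ts = LocalDateTime.of(2021, 4, 1, 0, 0, 0);
        System.out.println(toPostgresLiteral(ts)); // prints TIMESTAMP '2021-04-01 00:00:00'
    }
}
```

Quoting the plain string `'2021-04-01 00:00:00'` (as in the failing query) is also valid to PostgreSQL in a comparison context, so the reported syntax error presumably comes from whatever SQL the pushed-down filter compiles to, not from the literal shown here.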
--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org