Hi, I have an Oracle table with a column defined as `DATA_DATE DATE`, holding values such as `31-MAR-02`.
I am trying to retrieve data from Oracle using spark-sql-2.4.1. I set the JDBC options as below:

```java
.option("lowerBound", "2002-03-31 00:00:00");
.option("upperBound", "2019-05-01 23:59:59");
.option("timestampFormat", "yyyy-mm-dd hh:mm:ss");
.option("partitionColumn", "DATA_DATE");
.option("numPartitions", 240);
```

But it gives this error:

```
java.lang.IllegalArgumentException: Timestamp format must be yyyy-mm-dd hh:mm:ss[.fffffffff]
    at java.sql.Timestamp.valueOf(Timestamp.java:204)
    at org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation$.toInternalBoundValue(JDBCRelation.scala:179)
```

Any clue how this needs to be handled/fixed?

Any help is highly appreciated. Regards, Shyam
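For context, the stack trace shows that Spark's `JDBCRelation$.toInternalBoundValue` hands the bound strings to `java.sql.Timestamp.valueOf`, so the failure can be reproduced outside Spark. A minimal sketch (the class name `BoundFormatCheck` is mine, just for illustration):

```java
import java.sql.Timestamp;

public class BoundFormatCheck {
    public static void main(String[] args) {
        // java.sql.Timestamp.valueOf only accepts JDBC timestamp escape
        // literals of the form yyyy-[m]m-[d]d hh:mm:ss[.f...], which is
        // why bounds like "2002-03-31 00:00:00" must keep that shape.
        Timestamp lower = Timestamp.valueOf("2002-03-31 00:00:00");
        System.out.println("parsed lower bound: " + lower);

        // An Oracle-style literal such as 31-MAR-02 is rejected with the
        // same IllegalArgumentException seen in the stack trace above.
        try {
            Timestamp.valueOf("31-MAR-02");
        } catch (IllegalArgumentException e) {
            System.out.println("rejected: " + e.getMessage());
        }
    }
}
```

So whatever Spark receives as `lowerBound`/`upperBound` for a timestamp partition column ultimately has to parse through `Timestamp.valueOf`.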