[ https://issues.apache.org/jira/browse/SPARK-39633?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Huaxin Gao updated SPARK-39633:
-------------------------------
    Issue Type: Improvement  (was: Bug)

> Dataframe options for time travel via `timestampAsOf` should respect both formats of specifying timestamp
> ---------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-39633
>                 URL: https://issues.apache.org/jira/browse/SPARK-39633
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 3.3.0
>            Reporter: Prashant Singh
>            Priority: Minor
>
> Presently, a Spark SQL time-travel query such as:
> {{SELECT * FROM \{table} TIMESTAMP AS OF 1548751078}}
> works correctly, which is also what the SQL grammar specifies (((FOR SYSTEM_VERSION) | VERSION) AS OF version=(INTEGER_VALUE | STRING)). However, passing the same timestamp through the DataFrame option `timestampAsOf` fails with:
> {quote}[info] org.apache.spark.sql.AnalysisException: '1548751078' is not a valid timestamp expression for time travel.
> [info]   at org.apache.spark.sql.errors.QueryCompilationErrors$.invalidTimestampExprForTimeTravel(QueryCompilationErrors.scala:2413)
> [info]   at org.apache.spark.sql.catalyst.analysis.TimeTravelSpec$.create(TimeTravelSpec.scala:55)
> [info]   at org.apache.spark.sql.execution.datasources.v2.DataSourceV2Utils$.loadV2Source(DataSourceV2Utils.scala:128)
> [info]   at org.apache.spark.sql.DataFrameReader.$anonfun$load$1(DataFrameReader.scala:209)
> [info]   at scala.Option.flatMap(Option.scala:271)
> [info]   at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:207)
> [info]   at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:171)
> [info]   at org.apache.spark.sql.connector.SupportsCatalogOptionsSuite.load(SupportsCatalogOptionsSuite.scala:365)
> [info]   at org.apache.spark.sql.connector.SupportsCatalogOptionsSuite.$anonfun$new$33(SupportsCatalogOptionsSuite.scala:329)
> [info]   at org.apache.spark.sql.QueryTest.checkAnswer(QueryTest.scala:133)
> [info]   at org.apache.spark.sql.QueryTest.checkAnswer(QueryTest.scala:158)
> [info]   at org.apache.spark.sql.connector.SupportsCatalogOptionsSuite.$anonfun$new$30(SupportsCatalogOptionsSuite.scala:329)
> [info]   at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
> [info]   at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1490)
> [info]   at org.apache.spark.sql.test.SQLTestUtilsBase.withTable(SQLTestUtils.scala:306)
> [info]   at org.apache.spark.sql.test.SQLTestUtilsBase.withTable$(SQLTestUtils.scala:304)
> [info]   at org.apache.spark.sql.connector.SupportsCatalogOptionsSuite.withTable(SupportsCatalogOptionsSuite.scala:44)
> [info]   at org.apache.spark.sql.connector.SupportsCatalogOptionsSuite.$anonfun$new$26(SupportsCatalogOptionsSuite.scala:309)
> [info]   at org.scalatest.OutcomeOf.outcomeOf(OutcomeOf.scala:85)
> [info]   at org.scalatest.OutcomeOf.outcomeOf$(OutcomeOf.scala:83)
> [info]   at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
> [info]   at org.scalatest.Transformer.apply(Transformer.scala:22)
> [info]   at org.scalatest.Transformer.apply(Transformer.scala:20)
> [info]   at org.scalatest.funsuite.AnyFunSuiteLike$$anon$1.apply(AnyFunSuiteLike.scala:190)
> [info]   at org.apache.spark.SparkFunSuite.withFixture(SparkFunSuite.scala:203)
> [info]   at org.scalatest.funsuite.AnyFunSuiteLike.invokeWithFixture$1(AnyFunSuiteLike.scala:188)
> [info]   at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTest$1(AnyFunSuiteLike.scala:200)
> [info]   at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306)
> [info]   at org.scalatest.funsuite.AnyFunSuiteLike.runTest(AnyFunSuiteLike.scala:200)
> [info]   at org.scalatest.funsuite.AnyFunSuiteLike.runTest$(AnyFunSuiteLike.scala:182)
> {quote}

--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
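
The improvement requested above boils down to accepting two spellings of the same instant: an epoch-seconds integer (the `INTEGER_VALUE` alternative in the quoted grammar, e.g. `1548751078`) and a timestamp string. A minimal sketch of that lenient parsing, in Python rather than Spark's Scala; `parse_timestamp_as_of` is a hypothetical helper for illustration, not Spark's actual `TimeTravelSpec` logic:

```python
from datetime import datetime, timezone

def parse_timestamp_as_of(value: str) -> datetime:
    """Hypothetical helper: resolve a `timestampAsOf` option value that
    may be either epoch seconds or a timestamp string (assumed UTC)."""
    try:
        # First try the value as an integer number of epoch seconds,
        # the form the SQL grammar already accepts.
        return datetime.fromtimestamp(int(value), tz=timezone.utc)
    except ValueError:
        # Otherwise treat it as a plain timestamp string.
        return datetime.strptime(value, "%Y-%m-%d %H:%M:%S").replace(
            tzinfo=timezone.utc)

# Both spellings of the same instant resolve to the same datetime:
print(parse_timestamp_as_of("1548751078"))
print(parse_timestamp_as_of("2019-01-29 08:37:58"))
```

The point is only that the option parser falls back from one format to the other instead of rejecting the integer form outright, which is the behaviour the issue reports.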