bhasudha commented on issue #1860:
URL: https://github.com/apache/hudi/issues/1860#issuecomment-663036038


   I am not sure if this has to do with Spark caching the table metadata. In 
any case, could you try setting the config 
`spark.sql.hive.convertMetastoreParquet` to false, as shown here - 
https://hudi.apache.org/docs/docker_demo.html#step-4-b-run-spark-sql-queries - 
and try again?
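   For reference, a minimal sketch of passing that config at shell launch 
(this is just one way to set it; it can also be set per session):

   ```shell
   # Launch spark-shell with Hive's built-in Parquet conversion disabled,
   # so Spark honors the table's registered (Hudi) input format instead of
   # reading the Parquet files directly.
   spark-shell --conf "spark.sql.hive.convertMetastoreParquet=false"
   ```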
   
   Were you specifically testing the Spark Datasource API? Since your table is 
already registered in Hive, directly querying the Hive table using Spark 
SQL should also work. 
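   A hedged sketch of that query path using the `spark-sql` CLI - the 
database and table names below are placeholders, not taken from this issue:

   ```shell
   # Query the Hive-registered table through Spark SQL directly.
   # "default.hudi_table" is a placeholder for the table you synced to Hive.
   spark-sql \
     --conf "spark.sql.hive.convertMetastoreParquet=false" \
     -e "SELECT count(*) FROM default.hudi_table"
   ```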


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]
