rohit-m-99 commented on issue #6335:
URL: https://github.com/apache/hudi/issues/6335#issuecomment-1210010310

   Running into an issue now with `.option("hoodie.metadata.enable", "true")`. 
When doing so I receive the following error alongside a `FileNotFoundException`:
   
   `It is possible the underlying files have been updated. You can explicitly 
invalidate the cache in Spark by running 'REFRESH TABLE tableName' command in 
SQL or by recreating the Dataset/DataFrame involved.`
   
   When creating the dataframe I see the potentially related warning:
   
   `Metadata record for . encountered some files to be deleted which was not 
added before. Ignoring the spurious deletes as the 
`_hoodie.metadata.ignore.spurious.deletes` config is set to true`
   
   Querying without the metadata table enabled causes no issues, but enabling it 
appears to trigger this error. Note that this happens only when not using the 
glob pattern discussed above.
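
   For reference, a minimal PySpark sketch of the read path that hits this, 
assuming a Hudi table read via the datasource API (the table path, app name, 
and table name below are placeholders, not taken from the original report):

   ```python
   from pyspark.sql import SparkSession

   # Hypothetical table location -- substitute your own Hudi base path.
   base_path = "s3://my-bucket/my-hudi-table"

   spark = SparkSession.builder.appName("hudi-metadata-read").getOrCreate()

   # Reading with the metadata table enabled -- the configuration that
   # coincides with the FileNotFoundException described above.
   df = (
       spark.read.format("hudi")
       .option("hoodie.metadata.enable", "true")
       .load(base_path)
   )

   # The Spark error message itself suggests refreshing cached file
   # listings before re-querying (table name here is hypothetical):
   spark.catalog.refreshTable("my_table")
   # or equivalently: spark.sql("REFRESH TABLE my_table")
   ```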


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
