garyli1019 commented on issue #4887:
URL: https://github.com/apache/hudi/issues/4887#issuecomment-1066544282


   @fisser001 Impala actually doesn't support Hudi Hive sync, which is why we need to manually create the external table, recover partitions, and refresh the table. Tables created by Hive sync use HoodieHiveInputFormat, but Impala reads HUDI_PARQUET as regular Parquet. These two are totally different, so using both Hive and Impala on the same table could be problematic.
   Would you try this: create an external Impala table pointing to the Hudi HDFS path, then run Impala queries to recover partitions and refresh the table. If this still doesn't work, we probably need some help from Impala support.
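
   The steps above might look like the following in impala-shell. This is only a sketch: the table name, columns, and HDFS location are placeholders you would replace with your own, and the column list must match your Hudi table's schema (including the `_hoodie_*` metadata columns).

   ```sql
   -- Placeholder table name, schema, and location; adjust to your table.
   CREATE EXTERNAL TABLE hudi_tbl (
     _hoodie_commit_time STRING,
     id BIGINT,
     name STRING
   )
   PARTITIONED BY (dt STRING)
   STORED AS PARQUET
   LOCATION 'hdfs:///data/hudi/hudi_tbl';

   -- Discover the partition directories already on HDFS.
   ALTER TABLE hudi_tbl RECOVER PARTITIONS;

   -- Reload file metadata so newly written data files become visible.
   REFRESH hudi_tbl;
   ```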


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
