bhasudha commented on issue #1787:
URL: https://github.com/apache/hudi/issues/1787#issuecomment-657999488


   > @bhasudha in my setup we are not running Hive; we are just using the 
metastore from Hive. So what I did is register an external table like
   > CREATE TABLE test.hoodie_test2(
   > "_hoodie_commit_time" varchar,
   > "_hoodie_commit_seqno" varchar,
   > "_hoodie_record_key" varchar,
   > "_hoodie_partition_path" varchar,
   > "_hoodie_file_name" varchar,
   > "column" varchar,
   > "data_type" varchar,
   > "is_data_type_inferred" varchar,
   > "completeness" double,
   > "approximate_num_distinct_values" bigint,
   > "histogram" array(row(count bigint, ratio double, value varchar)),
   > "mean" double,
   > "maximum" double,
   > "minimum" double,
   > "sum" double,
   > "std_dev" double,
   > approx_percentiles ARRAY )
   > WITH (
   > format='parquet',
   > external_location='s3a://tempwrite/hudi/'
   > )
   > Just wanted to know if this is the right way of doing it. Is it going to 
lose any of the functionality?
   
   @asheeshgarg I understand you are not running Hive. For Presto queries, the 
Hudi table needs to be registered with the Hive metastore, which looks like 
what you are trying to do above, but from Presto. The DDL above might not work 
because it doesn't tell the metastore that this table is Hudi-formatted, and 
supporting that from Presto would take work on the Presto side. Instead, you 
might want to do Hive sync (which registers the table schema with the Hive 
metastore) - https://hudi.apache.org/docs/writing_data.html#syncing-to-hive - 
which sets the input and output formats and also the SerDe as Hudi requires.
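   For reference, a copy-on-write Hudi table registered via Hive sync ends up 
with Hudi's input format and the Parquet SerDe set on it, roughly like the 
sketch below. This is illustrative only (HiveSyncTool generates the actual 
DDL); the table name, columns, and location are taken from the DDL in the 
question above.

   ```sql
   -- Rough sketch of what Hive sync registers for a copy-on-write table.
   CREATE EXTERNAL TABLE test.hoodie_test2 (
     `_hoodie_commit_time` string,
     `_hoodie_commit_seqno` string,
     `_hoodie_record_key` string,
     `_hoodie_partition_path` string,
     `_hoodie_file_name` string
     -- ... remaining data columns ...
   )
   ROW FORMAT SERDE
     'org.apache.hadoop.hive.ql.io.parquet.serde.ParquetHiveSerDe'
   STORED AS INPUTFORMAT
     'org.apache.hudi.hadoop.HoodieParquetInputFormat'
   OUTPUTFORMAT
     'org.apache.hadoop.hive.ql.io.parquet.MapredParquetOutputFormat'
   LOCATION 's3a://tempwrite/hudi/';
   ```

   The key difference from a plain `format='parquet'` registration is the 
input format: `HoodieParquetInputFormat` is what lets query engines filter 
file slices by commit so readers see a consistent snapshot.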


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]

