Hi Everyone,

Just wondering if there is a way to create external tables in Hive backed
by Parquet files (which store Thrift-serialized data) in HDFS without
providing the column schema by hand. We have thousands of nested columns,
and it would be great if Hive could infer the schema by reading the
Parquet file metadata, the way Spark does.
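For context, this is roughly what we have to write today (table, column,
and path names below are made up for illustration; on Hive 0.12 the
Parquet SerDe and input/output formats would also need to be spelled out
explicitly, since native STORED AS PARQUET only landed later):

```sql
-- Illustrative only: every column must be declared up front,
-- which is what we are hoping to avoid.
CREATE EXTERNAL TABLE events (
  user STRUCT<id: BIGINT, name: STRING>,
  ts   BIGINT
)
STORED AS PARQUET
LOCATION 'hdfs:///data/events';
```

What we would like is something closer to Spark, where the schema comes
out of the Parquet footers automatically instead of being repeated in the
DDL.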

Is it possible to do this in Hive 0.12 as well?
