Hello,

I've been looking for good ways to create and write to Hive tables from
Java code. So far, I've considered the following options:

1. Create the Hive table using the JDBC client, write data to HDFS using bare
HDFS operations, and then load that data into the Hive table, again via the
JDBC client.
2. Use HCatalog.

I didn't like #1 since I'd have to write a lot of code to handle the various
file formats myself, which I'm guessing has already been done (a rough sketch
of what I mean by #1 is below). Using HCatalog (#2) looks really simple from
Pig and MapReduce, but I wasn't able to figure out how to write to a Hive
table outside of a MapReduce job.
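For concreteness, here is roughly what #1 would look like (an untested sketch
against HiveServer2; the JDBC URL, credentials, HDFS path, table name, and
schema are all placeholders):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

// Rough sketch of #1: write a delimited file to HDFS by hand, then create
// the table and load the file over JDBC. Everything here (URL, user, path,
// table name, schema) is a placeholder.
public class HiveJdbcLoadSketch {
    public static void main(String[] args) throws Exception {
        // Step 1: write the data to HDFS with bare FileSystem calls. This is
        // the part where I'd have to handle the file format (delimiters,
        // escaping, compression, ...) myself.
        Configuration conf = new Configuration();
        try (FileSystem fs = FileSystem.get(conf);
             FSDataOutputStream out =
                     fs.create(new Path("/tmp/my_table_data/part-0"))) {
            out.writeBytes("1\talice\n");
            out.writeBytes("2\tbob\n");
        }

        // Step 2: create the table and load the file through the HiveServer2
        // JDBC client.
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        try (Connection conn = DriverManager.getConnection(
                "jdbc:hive2://localhost:10000/default", "user", "");
             Statement stmt = conn.createStatement()) {
            stmt.execute("CREATE TABLE IF NOT EXISTS my_table (id INT, name STRING) "
                    + "ROW FORMAT DELIMITED FIELDS TERMINATED BY '\\t' "
                    + "STORED AS TEXTFILE");
            stmt.execute("LOAD DATA INPATH '/tmp/my_table_data' INTO TABLE my_table");
        }
    }
}

Step 1 is the part I'd rather not own, since handling delimiters, escaping,
and the different storage formats by hand seems like reinventing the wheel.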

Any help would be greatly appreciated!

Thanks,
Alvin
