[
https://issues.apache.org/jira/browse/HIVE-6897?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14592366#comment-14592366
]
Alen Frantz commented on HIVE-6897:
-----------------------------------
I am facing the same issue. Since HCatalog has no overwrite feature, we have to
handle it outside Pig.
The current workaround is to delete the part files inside the table directory
before executing the Pig script. But be careful here: running rm -r on the
table directory is not good practice.
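For reference, the manual refresh can be sketched roughly as below. The table
path, partition spec, and script name are all hypothetical placeholders, not
taken from this ticket; adjust them to your own table.

```shell
#!/bin/sh
# Hypothetical external-table location and partition (assumptions for illustration).
TABLE_DIR=/user/hive/warehouse/my_db.db/my_table
PART_SPEC="dt=2014-04-10"

# 1) Drop the partition from the metastore so Hive stays consistent:
hive -e "ALTER TABLE my_table DROP IF EXISTS PARTITION (dt='2014-04-10')"

# 2) Delete only that partition's directory, not the whole table
#    (safer than a blanket rm -r on the table directory):
hadoop fs -rm -r "$TABLE_DIR/$PART_SPEC"

# 3) Re-run the Pig script; HCatStorer then writes into the now-empty partition:
pig -useHCatalog refresh_partition.pig
```

This keeps the deletion scoped to one partition, which is the painful manual
step an overwrite option in HCatStorer would remove.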
Many people are facing the same issue. Adding these features would take
HCatalog to a higher level and would help the community.
I would really appreciate it if these features were added.
Also, I would be glad to help with this in any way. Feel free to get in touch.
Alen
> Allow overwrite/append to external Hive table (with partitions) via HCatStorer
> ------------------------------------------------------------------------------
>
> Key: HIVE-6897
> URL: https://issues.apache.org/jira/browse/HIVE-6897
> Project: Hive
> Issue Type: Improvement
> Components: HCatalog, HiveServer2
> Affects Versions: 0.12.0
> Reporter: Dip Kharod
>
> I'm using HCatStorer to write to an external, partitioned Hive table from Pig
> and have the following different use cases:
> 1) Need to overwrite (aka, refresh) data in the table: Currently I end up
> doing this outside of Pig (drop the partition and delete the HDFS folder),
> which is very painful and error-prone
> 2) Need to append (aka, add a new file) data to the Hive external
> table/partition: Again, I end up doing this outside of Pig by copying the
> file into the appropriate folder
> It would be very productive for the developers to have both options in
> HCatStorer.
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)