Re: unable to write partitions with HCatWriter on Cloudera with Sentry and HDFS ACL plugin

2018-09-25 Thread Nathan Bamford
Alex, thank you for the response. I do see that all of the directories in question are owned by hive:hive, which makes sense, and indeed, writing to them via HCatWriter seems to be no problem. The problem arises in the
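[Editor's note: for context, the standard HCatalog data-transfer write flow looks roughly like the sketch below. This is a minimal illustration, not the poster's code; the metastore URI, database, table, partition values, and record fields are hypothetical placeholders.]

import java.util.Arrays;
import java.util.HashMap;
import java.util.Iterator;
import java.util.Map;

import org.apache.hive.hcatalog.data.DefaultHCatRecord;
import org.apache.hive.hcatalog.data.HCatRecord;
import org.apache.hive.hcatalog.data.transfer.DataTransferFactory;
import org.apache.hive.hcatalog.data.transfer.HCatWriter;
import org.apache.hive.hcatalog.data.transfer.WriteEntity;
import org.apache.hive.hcatalog.data.transfer.WriterContext;

public class HCatWriterSketch {
    public static void main(String[] args) throws Exception {
        Map<String, String> config = new HashMap<>();
        config.put("hive.metastore.uris", "thrift://metastore-host:9083"); // placeholder URI

        Map<String, String> partitionSpec = new HashMap<>();
        partitionSpec.put("dt", "2018-09-25"); // placeholder partition value

        WriteEntity entity = new WriteEntity.Builder()
                .withDatabase("mydb")      // placeholder database
                .withTable("mytable")      // placeholder table
                .withPartition(partitionSpec)
                .build();

        // "Master" side: obtain a writer and a serializable WriterContext.
        HCatWriter master = DataTransferFactory.getHCatWriter(entity, config);
        WriterContext context = master.prepareWrite();

        // "Slave" side (normally a separate task): write records using the context.
        HCatRecord record = new DefaultHCatRecord(2);
        record.set(0, 1);
        record.set(1, "example");
        Iterator<HCatRecord> records = Arrays.<HCatRecord>asList(record).iterator();

        HCatWriter slave = DataTransferFactory.getHCatWriter(context);
        slave.write(records);

        // Back on the master: commit finalizes the write and registers the
        // partition with the metastore.
        master.commit(context);
    }
}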

Re: Control large file output in dynamic partitioned insert

2018-09-25 Thread Patrick Duin
OK, found my own answer via: https://www.ericlin.me/2016/03/hive-dynamic-insert-query-only-uses-1-reducer-out-of-thousands-of-reducers/ This setting gets rid of the last reduce phase in my insert: set hive.optimize.sort.dynamic.partition=false; Now I get as many files in my partition as I have
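[Editor's note: a minimal sketch of applying that setting programmatically over HiveServer2 JDBC before a dynamic-partition insert. The JDBC URL, table, and column names are hypothetical placeholders, not from the thread.]

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class DynamicPartitionInsertSketch {
    public static void main(String[] args) throws Exception {
        // HiveServer2 JDBC driver; URL and credentials are placeholders.
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        try (Connection conn = DriverManager.getConnection(
                     "jdbc:hive2://hiveserver2-host:10000/default", "user", "");
             Statement stmt = conn.createStatement()) {

            // Turn off the sorted dynamic-partition optimization so the insert
            // does not funnel each partition through a final reduce phase.
            stmt.execute("SET hive.optimize.sort.dynamic.partition=false");
            stmt.execute("SET hive.exec.dynamic.partition.mode=nonstrict");

            // Dynamic-partition insert; hypothetical source/target tables.
            stmt.execute("INSERT INTO TABLE target_table PARTITION (dt) "
                       + "SELECT col_a, col_b, dt FROM source_table");
        }
    }
}

With the optimization disabled, the number of output files per partition tracks the number of writing tasks rather than collapsing to a single file per partition, as the linked article describes.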