danny0405 commented on code in PR #9221:
URL: https://github.com/apache/hudi/pull/9221#discussion_r1285152739
##########
hudi-sync/hudi-hive-sync/src/main/java/org/apache/hudi/hive/HiveSyncConfig.java:
##########
@@ -98,8 +98,9 @@ public HiveSyncConfig(Properties props) {
public HiveSyncConfig(Properties props, Configuration hadoopConf) {
super(props, hadoopConf);
- HiveConf hiveConf = hadoopConf instanceof HiveConf
- ? (HiveConf) hadoopConf : new HiveConf(hadoopConf, HiveConf.class);
+ HiveConf hiveConf = new HiveConf();
+ // HiveConf needs to load Hadoop conf to allow instantiation via AWSGlueClientFactory
+ hiveConf.addResource(hadoopConf);
Review Comment:
> but it's possible that there are other configs/custom configs passed in via Spark session,
Is this a common way for people to pass Hive options around with Spark?
> An alternative solution would be always pass hadoopConf to HiveConf constructor
Would that introduce too much overhead?
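For reference, a minimal sketch (not part of the PR) contrasting the two construction paths under discussion: layering the Hadoop conf onto a fresh `HiveConf` via `addResource`, versus copying it in through the `HiveConf(Configuration, Class)` constructor. The class name and the property value are illustrative only; the sketch assumes `hive-common` and `hadoop-common` are on the classpath.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hive.conf.HiveConf;

public class HiveConfWiring {
  public static void main(String[] args) {
    Configuration hadoopConf = new Configuration();
    // Hypothetical property, standing in for configs passed via the Spark session.
    hadoopConf.set("hive.metastore.uris", "thrift://example:9083");

    // Option A (this PR): start from an empty HiveConf, then add the Hadoop
    // conf as a resource so its properties become visible to HiveConf readers.
    HiveConf viaAddResource = new HiveConf();
    viaAddResource.addResource(hadoopConf);

    // Option B (the alternative quoted above): copy the Hadoop conf into the
    // HiveConf at construction time.
    HiveConf viaConstructor = new HiveConf(hadoopConf, HiveConf.class);

    System.out.println(viaAddResource.get("hive.metastore.uris"));
    System.out.println(viaConstructor.get("hive.metastore.uris"));
  }
}
```

Both paths should surface the Hadoop-level properties; the question raised above is whether constructing a `HiveConf` from the full Hadoop conf on every call is acceptably cheap.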
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]