pengzhiwei2018 commented on a change in pull request #3387:
URL: https://github.com/apache/hudi/pull/3387#discussion_r682249809



##########
File path: 
hudi-spark-datasource/hudi-spark-common/src/main/scala/org/apache/hudi/DataSourceOptions.scala
##########
@@ -399,6 +400,11 @@ object DataSourceWriteOptions {
     .defaultValue(1000)
     .withDocumentation("The number of partitions one batch when synchronous 
partitions to hive.")
 
+  val HIVE_SYNC_MODE: ConfigProperty[String] = ConfigProperty
+    .key("hoodie.datasource.hive_sync.mode")
+    .noDefaultValue()

Review comment:
       There is a compatibility problem here. If we give HIVE_SYNC_MODE a 
default value, `useJdbc` will no longer take effect, which would break jobs 
on existing Hudi tables.
   But I agree with adding an enum for the constant.
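To illustrate the compatibility concern: a hedged sketch (not Hudi's actual code, names and mode strings are illustrative) of how the resolution can stay backward compatible precisely because `hoodie.datasource.hive_sync.mode` has no default, letting the legacy `useJdbc` flag still decide when the new key is unset.

```java
// Hypothetical sketch of sync-mode resolution, assuming the legacy mapping
// useJdbc=true -> JDBC and useJdbc=false -> Hive driver ("hiveql").
public class SyncModeResolver {

  // If the new HIVE_SYNC_MODE key is unset (no default), fall back to the
  // legacy use_jdbc flag so existing table jobs keep their old behavior.
  public static String resolveSyncMode(String syncMode, boolean useJdbc) {
    if (syncMode != null && !syncMode.isEmpty()) {
      return syncMode.toLowerCase();  // explicit new-style config wins
    }
    // No default on HIVE_SYNC_MODE means this legacy path is still reachable.
    return useJdbc ? "jdbc" : "hiveql";
  }
}
```

If HIVE_SYNC_MODE had a default, the first branch would always win and `useJdbc` would silently stop working, which is the regression described above.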

##########
File path: 
hudi-spark-datasource/hudi-spark-common/src/main/scala/org/apache/hudi/DataSourceOptions.scala
##########
@@ -347,6 +346,8 @@ object DataSourceWriteOptions {
     .defaultValue("false")
     .withDocumentation("")
 
+  // We should use HIVE_SYNC_MODE instead of this config from 0.9.0

Review comment:
       Makes sense.




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
