xushiyan commented on code in PR #7448:
URL: https://github.com/apache/hudi/pull/7448#discussion_r1047691929
##########
hudi-spark-datasource/hudi-spark-common/src/main/scala/org/apache/spark/sql/catalyst/catalog/HoodieCatalogTable.scala:
##########
@@ -216,20 +217,21 @@ class HoodieCatalogTable(val spark: SparkSession, var table: CatalogTable) exten
   private def parseSchemaAndConfigs(): (StructType, Map[String, String]) = {
     val globalProps = DFSPropertiesConfiguration.getGlobalProps.asScala.toMap
     val globalTableConfigs = mappingSparkDatasourceConfigsToTableConfigs(globalProps)
-    val globalSqlOptions = HoodieOptionConfig.mappingTableConfigToSqlOption(globalTableConfigs)
+    val globalSqlOptions = mapTableConfigsToSqlOptions(globalTableConfigs)
-    val sqlOptions = HoodieOptionConfig.withDefaultSqlOptions(globalSqlOptions ++ catalogProperties)
+    val sqlOptions = withDefaultSqlOptions(globalSqlOptions ++
+      mapDataSourceWriteOptionsToSqlOptions(catalogProperties) ++
+      catalogProperties)
Review Comment:
When using `saveAsTable()`, the table is not yet created, so no table configs
are available and `catalogProperties` contains `hoodie.datasource.write.*`
configs. These should be converted to sql options, which will later be stored
as table configs. Within `catalogProperties`, existing sql options should take
precedence over the corresponding `hoodie.datasource.write.*` configs.
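The precedence described above falls out of Scala's `Map` concatenation, where keys from the right-hand operand win. A minimal sketch of that ordering, using a hypothetical simplified mapping (the real code delegates to `HoodieOptionConfig`, and the key pairs below are illustrative assumptions):

```scala
object SqlOptionPrecedence {
  // Hypothetical stand-in for the real mapping of
  // hoodie.datasource.write.* keys to sql option keys.
  def mapDataSourceWriteOptionsToSqlOptions(props: Map[String, String]): Map[String, String] =
    props.collect {
      case ("hoodie.datasource.write.recordkey.field", v) => "primaryKey" -> v
      case ("hoodie.datasource.write.table.type", v)      => "type" -> v
    }

  def main(args: Array[String]): Unit = {
    // catalogProperties carries both a write option and an existing sql option.
    val catalogProperties = Map(
      "hoodie.datasource.write.recordkey.field" -> "uuid",
      "primaryKey" -> "id" // the existing sql option should win
    )
    val globalSqlOptions = Map("type" -> "cow")

    // In `a ++ b`, entries of `b` override duplicate keys of `a`, so placing
    // catalogProperties last lets its sql options override the mapped ones.
    val sqlOptions =
      globalSqlOptions ++
        mapDataSourceWriteOptionsToSqlOptions(catalogProperties) ++
        catalogProperties

    assert(sqlOptions("primaryKey") == "id")
    println(sqlOptions("primaryKey"))
  }
}
```

If `catalogProperties` had no `primaryKey` entry, the mapped `hoodie.datasource.write.recordkey.field` value (`uuid`) would survive instead, which is the `saveAsTable()` case the comment describes.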
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]