beyond1920 commented on code in PR #9485:
URL: https://github.com/apache/hudi/pull/9485#discussion_r1300104947
##########
hudi-spark-datasource/hudi-spark-common/src/main/scala/org/apache/hudi/DefaultSource.scala:
##########
@@ -102,7 +102,7 @@ class DefaultSource extends RelationProvider
)
} else {
Map()
-    }) ++ DataSourceOptionsHelper.parametersWithReadDefaults(sqlContext.getAllConfs.filter(k => k._1.startsWith("hoodie.")) ++ optParams)
+    }) ++ DataSourceOptionsHelper.parametersWithReadDefaults(extractHoodieConfig(sqlContext.getAllConfs) ++ optParams)
Review Comment:
Please add some tests, such as running `procedure`s, `command`s, and queries.
##########
hudi-spark-datasource/hudi-spark-common/src/main/scala/org/apache/spark/sql/hudi/HoodieSqlCommonUtils.scala:
##########
@@ -223,6 +223,17 @@ object HoodieSqlCommonUtils extends SparkAdapterSupport {
def isHoodieConfigKey(key: String): Boolean =
    key.startsWith("hoodie.") || key == DataSourceReadOptions.TIME_TRAVEL_AS_OF_INSTANT.key
+  /**
+   * Extract hoodie config from conf using prefix "spark.hoodie." and "hoodie.".
+ */
+ def extractHoodieConfig(conf: Map[String, String]): Map[String, String] = {
Review Comment:
Should `ProvidesHoodieConfig#combineOptions` also call this new method, to cover
the code paths of `Command`s (`AlterHoodieTableDropPartitionCommand`,
`InsertIntoHoodieTableCommand`, ..)?
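The quoted hunk truncates the body of `extractHoodieConfig`, so for context, here is a minimal, self-contained sketch of what a helper with that doc comment plausibly does (the exact implementation is an assumption, not taken from the PR): keep `hoodie.*` entries as-is, and normalize `spark.hoodie.*` entries by stripping the `spark.` prefix so both forms reach Hudi under `hoodie.*` keys.

```scala
object ConfigExtractor {
  // Hypothetical sketch of the PR's extractHoodieConfig: the PR snippet only
  // shows the signature, so the body below is an assumption based on the doc
  // comment ("prefix \"spark.hoodie.\" and \"hoodie.\"").
  def extractHoodieConfig(conf: Map[String, String]): Map[String, String] = {
    conf.collect {
      // Plain hoodie.* keys pass through unchanged.
      case (k, v) if k.startsWith("hoodie.") => k -> v
      // spark.hoodie.* keys are normalized to hoodie.* by dropping "spark.".
      case (k, v) if k.startsWith("spark.hoodie.") => k.stripPrefix("spark.") -> v
    }
  }
}
```

With this behavior, both `SET hoodie.x=y` and `SET spark.hoodie.x=y` in a Spark session would surface to the datasource as `hoodie.x`, while unrelated Spark confs are filtered out.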
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]