nsivabalan commented on a change in pull request #4915:
URL: https://github.com/apache/hudi/pull/4915#discussion_r839010055



##########
File path: 
hudi-spark-datasource/hudi-spark/src/test/scala/org/apache/spark/sql/hudi/TestSqlConf.scala
##########
@@ -85,6 +87,15 @@ class TestSqlConf extends TestHoodieSqlBase with BeforeAndAfter {
         s"$tablePath/" + HoodieTableMetaClient.METAFOLDER_NAME,
         HoodieTableConfig.PAYLOAD_CLASS_NAME.defaultValue).getTableType)
 
+      // Manually pass incremental configs to global configs to make sure Hudi query is able to load the
+      // global configs
+      DFSPropertiesConfiguration.addToGlobalProps(QUERY_TYPE.key, QUERY_TYPE_INCREMENTAL_OPT_VAL)

Review comment:
       @zhedoubushishi : what's the scope of these configs? Is it the full spark-sql or spark-shell session? Also, users have to remember to remove them if they don't need these configs for a 2nd query, right?
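For context, the concern can be sketched with a plain Scala stand-in (the `GlobalProps` object and `removeGlobalProps` name below are hypothetical, not the actual Hudi API): once a key lands in JVM-wide global props, every later query in the same session observes it unless the caller explicitly removes it.

```scala
// Hypothetical stand-in for DFSPropertiesConfiguration's global props:
// a JVM-wide mutable map, illustrating how a config added for one query
// leaks into the next query in the same session unless removed.
object GlobalProps {
  private val props = scala.collection.mutable.Map.empty[String, String]

  // Mirrors addToGlobalProps: affects all subsequent queries in the session.
  def addToGlobalProps(key: String, value: String): Unit = props(key) = value

  // Without an explicit removal like this, the setting persists.
  def removeGlobalProps(key: String): Unit = props.remove(key)

  def get(key: String): Option[String] = props.get(key)
}
```

Usage: after query 1 adds `hoodie.datasource.query.type -> incremental`, query 2 would also run as incremental unless the key is removed first, which is exactly the cleanup burden the question raises.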




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]