zhedoubushishi commented on a change in pull request #4087:
URL: https://github.com/apache/hudi/pull/4087#discussion_r756421773



##########
File path: website/docs/configurations.md
##########
@@ -15,6 +15,20 @@ This page covers the different ways of configuring your job to write/read Hudi t
 - [**Metrics Configs**](#METRICS): These configs are used to enable monitoring and reporting of key Hudi stats and metrics.
 - [**Record Payload Config**](#RECORD_PAYLOAD): This is the lowest level of customization offered by Hudi. Record payloads define how to produce new values to upsert based on the incoming new record and the stored old record. Hudi provides default implementations such as OverwriteWithLatestAvroPayload, which simply updates the table with the latest/last-written record. This can be overridden with a custom class extending the HoodieRecordPayload class, at both the datasource and WriteClient levels.
 
+---
+In addition to passing configurations directly to Hudi jobs, since 0.10.0 Hudi also supports passing configurations through an external configuration file `hudi-default.conf`, in which each line consists of a key and a value separated by whitespace or an equals sign. For example:
+```
+hoodie.datasource.hive_sync.mode               jdbc
+hoodie.datasource.hive_sync.jdbcurl            jdbc:hive2://localhost:10000
+hoodie.datasource.hive_sync.support_timestamp  false
+```
+This is a cluster-level configuration; all Hudi jobs running in the cluster share the same configuration.
+The configuration is parsed and evaluated when the Hudi engine processes are started. Changes to the configuration file require restarting the relevant processes.
+

Review comment:
       Makes sense. Removed
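
For context on the snippet quoted above: the same settings can also be passed directly to a Hudi job at write time instead of through the cluster-wide `hudi-default.conf`. Below is a minimal sketch assuming Spark's Java datasource API; the table name, record key, precombine field, input data, and paths are placeholders, not values from this PR.

```
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SaveMode;
import org.apache.spark.sql.SparkSession;

public class HudiDirectConfigExample {
  public static void main(String[] args) {
    SparkSession spark = SparkSession.builder().appName("hudi-direct-config").getOrCreate();

    // Placeholder input; any DataFrame with the expected key/precombine fields works.
    Dataset<Row> df = spark.read().parquet("/tmp/source_data");

    df.write()
      .format("hudi")
      // Table/key settings below are placeholders, not part of the quoted snippet.
      .option("hoodie.table.name", "my_table")
      .option("hoodie.datasource.write.recordkey.field", "uuid")
      .option("hoodie.datasource.write.precombine.field", "ts")
      // Same Hive sync settings as the hudi-default.conf example above,
      // passed per job instead of cluster-wide.
      .option("hoodie.datasource.hive_sync.enable", "true")
      .option("hoodie.datasource.hive_sync.mode", "jdbc")
      .option("hoodie.datasource.hive_sync.jdbcurl", "jdbc:hive2://localhost:10000")
      .option("hoodie.datasource.hive_sync.support_timestamp", "false")
      .mode(SaveMode.Append)
      .save("/tmp/hudi/my_table");

    spark.stop();
  }
}
```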




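The quoted bullet on Record Payload Config mentions overriding the default payload with a custom class extending HoodieRecordPayload. A minimal sketch is below, under the assumption that the custom class extends the built-in OverwriteWithLatestAvroPayload; the class name and the `status` field are purely illustrative, not part of Hudi or this PR.

```
import java.io.IOException;

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.generic.IndexedRecord;
import org.apache.hudi.common.model.OverwriteWithLatestAvroPayload;
import org.apache.hudi.common.util.Option;

// Illustrative payload: keep the stored record unless the incoming record
// carries a non-null "status" field. Class and field names are hypothetical.
public class KeepExistingUnlessStatusPayload extends OverwriteWithLatestAvroPayload {

  public KeepExistingUnlessStatusPayload(GenericRecord record, Comparable orderingVal) {
    super(record, orderingVal);
  }

  @Override
  public Option<IndexedRecord> combineAndGetUpdateValue(IndexedRecord currentValue, Schema schema)
      throws IOException {
    Option<IndexedRecord> incoming = getInsertValue(schema);
    if (!incoming.isPresent()) {
      // Incoming payload is empty; fall back to the stored value.
      return Option.of(currentValue);
    }
    GenericRecord incomingRecord = (GenericRecord) incoming.get();
    // Only overwrite when the incoming record actually has a status set.
    return incomingRecord.get("status") == null ? Option.of(currentValue) : incoming;
  }
}
```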