flyrain commented on a change in pull request #2922:
URL: https://github.com/apache/iceberg/pull/2922#discussion_r682137739



##########
File path: site/docs/spark-configuration.md
##########
@@ -94,6 +94,14 @@ Spark's built-in catalog supports existing v1 and v2 tables tracked in a Hive Me
 
 This configuration can use the same Hive Metastore for both Iceberg and non-Iceberg tables.
 
+### Using catalog-specific Hadoop configuration values
+
+Similar to configuring Hadoop properties with `spark.hadoop.*`, it's possible to set per-catalog Hadoop configuration values in Spark by adding a property for the catalog with the prefix `spark.sql.catalog.(catalog-name).hadoop.*`. These properties take precedence over values configured globally with `spark.hadoop.*` and affect only Iceberg tables.
+
+```plain
+spark.sql.catalog.hadoop_prod.hadoop.fs.s3a.endpoint = http://aws-local:9000

Review comment:
       Maybe add an example for `hadoop.hive.metastore.uris`, which is one of the most common use cases here.
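       For instance, something along these lines (the catalog name `hive_prod` and the metastore host are hypothetical; 9083 is the default Hive Metastore Thrift port):

```plain
spark.sql.catalog.hive_prod.hadoop.hive.metastore.uris = thrift://hive-metastore-host:9083
```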




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]


