openinx commented on a change in pull request #2389:
URL: https://github.com/apache/iceberg/pull/2389#discussion_r603233452



##########
File path: site/docs/flink.md
##########
@@ -312,17 +312,47 @@ INSERT OVERWRITE hive_catalog.default.sample 
PARTITION(data='a') SELECT 6;
 For a partitioned iceberg table, when every partition column is assigned a value in the `PARTITION` clause, the query inserts into a static partition; when only some of the partition columns (a prefix of the full partition column list) are assigned values, the query result is written into dynamic partitions.
 For an unpartitioned iceberg table, `INSERT OVERWRITE` completely replaces the table's data.
 
-## Reading with DataStream
+## Iceberg Operations with the DataStream API
+### Loading an Iceberg Catalog
+#### Loading a Hadoop Catalog
+
+```java
+    Map<String, String> properties = new HashMap<>();
+    properties.put("type", "iceberg");
+    properties.put("catalog-type", "hadoop");
+    properties.put("property-version", "1");
+    properties.put("warehouse", "hdfs://nn:8020/warehouse/path");
+
+    CatalogLoader catalogLoader = CatalogLoader.hadoop(HADOOP_CATALOG, new Configuration(), properties);
+```
+
+#### Load Hive Catalog
+
+```java
+    Map<String, String> properties = new HashMap<>();
+    properties.put("type", "iceberg");
+    properties.put("catalog-type", "hive");
+    properties.put("property-version", "1");
+    properties.put("warehouse", "hdfs://nn:8020/warehouse/path");
+    properties.put("uri", "thrift://localhost:9083");
+    properties.put("clients", Integer.toString(2));
+
+    CatalogLoader catalogLoader = CatalogLoader.hive(HIVE_CATALOG, new Configuration(), properties);
+```
+
+*Note*: The examples that follow use a Hadoop catalog.
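+
+A `CatalogLoader` on its own does not open a table; to read or write a specific table, it is typically wrapped in a `TableLoader`. A minimal sketch (the `default.sample` table identifier here is only an illustration, not part of this change):
+
+```java
+    // Resolve a table from the catalog loaded above. "default.sample" is a
+    // hypothetical database/table pair used purely for illustration.
+    TableLoader tableLoader = TableLoader.fromCatalog(catalogLoader,
+        TableIdentifier.of("default", "sample"));
+```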
+
+### Reading with DataStream
 
 Iceberg now supports both streaming and batch reads through the Java API.
 
-### Batch Read
+#### Batch Read
 
 This example reads all records from an iceberg table and prints them to stdout in a flink batch job:
 
 ```java
 StreamExecutionEnvironment env = StreamExecutionEnvironment.createLocalEnvironment();
-TableLoader tableLoader = TableLoader.fromHadooptable("hdfs://nn:8020/warehouse/path");
+TableLoader tableLoader = TableLoader.fromHadoopTable("hdfs://nn:8020/warehouse/path");

Review comment:
       Thanks for the fix!




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]



---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
