This is an automated email from the ASF dual-hosted git repository.
jshao pushed a commit to branch branch-0.8
in repository https://gitbox.apache.org/repos/asf/gravitino.git
The following commit(s) were added to refs/heads/branch-0.8 by this push:
new f025777ddb [#6218] docs: Adding Documentation for
GravitinoPaimonCatalog in Flink Connectors (#6297)
f025777ddb is described below
commit f025777ddbd9ada426b2c57276d70dad3d70f72a
Author: github-actions[bot]
<41898282+github-actions[bot]@users.noreply.github.com>
AuthorDate: Thu Jan 16 17:22:28 2025 +0800
[#6218] docs: Adding Documentation for GravitinoPaimonCatalog in Flink
Connectors (#6297)
### What changes were proposed in this pull request?
close #6218
Co-authored-by: yangyang zhong <[email protected]>
---
docs/flink-connector/flink-catalog-paimon.md | 108 +++++++++++++++++++++++++++
1 file changed, 108 insertions(+)
diff --git a/docs/flink-connector/flink-catalog-paimon.md b/docs/flink-connector/flink-catalog-paimon.md
new file mode 100644
index 0000000000..87b3451a8a
--- /dev/null
+++ b/docs/flink-connector/flink-catalog-paimon.md
@@ -0,0 +1,108 @@
+---
+title: "Flink connector paimon catalog"
+slug: /flink-connector/flink-catalog-paimon
+keyword: flink connector paimon catalog
+license: "This software is licensed under the Apache License version 2."
+---
+
+This document provides a comprehensive guide on configuring and using the Apache Gravitino Flink connector to access the Paimon catalog managed by the Gravitino server.
+
+## Capabilities
+
+### Supported Paimon Table Types
+
+* AppendOnly Table
+
+### Supported Operation Types
+
+Most DDL and DML operations in Flink SQL are supported, except for the following operations:
+
+- Function operations
+- Partition operations
+- View operations
+- Querying UDFs
+- `LOAD` clause
+- `UNLOAD` clause
+- `CREATE TABLE LIKE` clause
+- `TRUNCATE TABLE` clause
+- `UPDATE` clause
+- `DELETE` clause
+- `CALL` clause
+
+## Requirements
+
+* Paimon 0.8
+
+Higher versions, such as 0.9 or above, may also be supported but have not been fully tested.
+
+## Getting Started
+
+### Prerequisites
+
+Place the following JAR files in the `lib` directory of your Flink installation:
+
+* paimon-flink-1.18-0.8.2.jar
+
+* gravitino-flink-connector-runtime-\${flinkMajorVersion}_$scalaVersion.jar
+
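+For example, assuming Flink is installed at `/opt/flink` and was built for Flink 1.18 with Scala 2.12 (the paths and version numbers below are illustrative; adjust them to your installation):
+
+```shell
+# Copy the Paimon Flink runtime and the Gravitino Flink connector into Flink's lib directory
+cp paimon-flink-1.18-0.8.2.jar /opt/flink/lib/
+cp gravitino-flink-connector-runtime-1.18_2.12.jar /opt/flink/lib/
+```
+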
+### SQL Example
+
+```sql
+
+-- Suppose paimon_catalog is the Paimon catalog name managed by Gravitino
+USE CATALOG paimon_catalog;
+-- Execute statement succeed.
+
+SHOW DATABASES;
+-- +---------------------+
+-- | database name |
+-- +---------------------+
+-- | default |
+-- | gravitino_paimon_db |
+-- +---------------------+
+
+SET 'execution.runtime-mode' = 'batch';
+-- [INFO] Execute statement succeed.
+
+SET 'sql-client.execution.result-mode' = 'tableau';
+-- [INFO] Execute statement succeed.
+
+CREATE TABLE paimon_table_a (
+ aa BIGINT,
+ bb BIGINT
+);
+
+SHOW TABLES;
+-- +----------------+
+-- | table name |
+-- +----------------+
+-- | paimon_table_a |
+-- +----------------+
+
+
+SELECT * FROM paimon_table_a;
+-- Empty set
+
+INSERT INTO paimon_table_a(aa, bb) VALUES (1, 2);
+-- [INFO] Submitting SQL update statement to the cluster...
+-- [INFO] SQL update statement has been successfully submitted to the cluster:
+-- Job ID: 74c0c678124f7b452daf08c399d0fee2
+
+SELECT * FROM paimon_table_a;
+-- +----+----+
+-- | aa | bb |
+-- +----+----+
+-- | 1 | 2 |
+-- +----+----+
+-- 1 row in set
+```
+
+## Catalog properties
+
+The Gravitino Flink connector transforms the following property names, defined in the catalog properties, into Flink Paimon connector configuration.
+
+| Gravitino catalog property name | Flink Paimon connector configuration | Description | Since Version |
+|---------------------------------|--------------------------------------|-------------|---------------|
+| `catalog-backend` | `metastore` | Catalog backend of the Gravitino Paimon catalog. Supports `filesystem`. | 0.8.0-incubating |
+| `warehouse` | `warehouse` | Warehouse directory of the catalog: `file:///user/hive/warehouse-paimon/` for local FS, `hdfs://namespace/hdfs/path` for HDFS, `s3://{bucket-name}/path/` for S3, or `oss://{bucket-name}/path` for Aliyun OSS. | 0.8.0-incubating |
+
+Gravitino catalog property names with the prefix `flink.bypass.` are passed directly to the Flink Paimon connector. For example, use `flink.bypass.clients` to pass the `clients` option to the Flink Paimon connector.
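+
+As an illustration, a Paimon catalog registered in Gravitino might carry properties like the following (the values are placeholders, not recommendations; the comments show the Flink Paimon option each one maps to):
+
+```properties
+# Gravitino catalog properties           Flink Paimon connector option
+catalog-backend = filesystem           # -> metastore
+warehouse = file:///tmp/paimon-warehouse   # -> warehouse
+flink.bypass.clients = 2               # passed through as `clients`
+```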