This is an automated email from the ASF dual-hosted git repository.
morningman pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/doris-website.git
The following commit(s) were added to refs/heads/master by this push:
new 984252359d [fix](docs) tvf add `resource` property (#688)
984252359d is described below
commit 984252359def988c11029fece0d2aa2621a8391c
Author: Tiewei Fang <[email protected]>
AuthorDate: Fri Jun 21 11:58:55 2024 +0800
[fix](docs) tvf add `resource` property (#688)
related: https://github.com/apache/doris/pull/35139
---
docs/sql-manual/sql-functions/table-functions/hdfs.md | 1 +
docs/sql-manual/sql-functions/table-functions/s3.md | 1 +
.../current/sql-manual/sql-functions/table-functions/hdfs.md | 1 +
.../current/sql-manual/sql-functions/table-functions/s3.md | 1 +
.../version-2.1/sql-manual/sql-functions/table-functions/hdfs.md | 1 +
.../version-2.1/sql-manual/sql-functions/table-functions/s3.md | 1 +
.../version-2.1/sql-manual/sql-functions/table-functions/hdfs.md | 1 +
.../version-2.1/sql-manual/sql-functions/table-functions/s3.md | 1 +
8 files changed, 8 insertions(+)
diff --git a/docs/sql-manual/sql-functions/table-functions/hdfs.md
b/docs/sql-manual/sql-functions/table-functions/hdfs.md
index 3b73028086..7a281e06c6 100644
--- a/docs/sql-manual/sql-functions/table-functions/hdfs.md
+++ b/docs/sql-manual/sql-functions/table-functions/hdfs.md
@@ -91,6 +91,7 @@ File format parameters:
other kinds of parameters:
- `path_partition_keys`: (optional) Specifies the column names carried in the file path. For example, if the file path is /path/to/city=beijing/date="2023-07-09", you should fill in `path_partition_keys="city,date"`. It will automatically read the corresponding column names and values from the path during the load process.
+- `resource`: (optional) Specifies the resource name. The HDFS TVF can use an existing HDFS resource to access HDFS directly. For how to create an HDFS resource, see [CREATE-RESOURCE](../../sql-statements/Data-Definition-Statements/Create/CREATE-RESOURCE.md). This property is supported starting from version 2.1.4.
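A minimal sketch of how the new property is meant to be used (the resource name, connection properties, and file path below are hypothetical; see the CREATE-RESOURCE docs for the exact property keys):

```sql
-- Create a reusable HDFS resource once (illustrative properties).
CREATE RESOURCE "my_hdfs_resource"
PROPERTIES (
    "type" = "hdfs",
    "fs.defaultFS" = "hdfs://127.0.0.1:8020",
    "hadoop.username" = "doris"
);

-- Reference the resource from the hdfs() TVF instead of
-- repeating the connection properties in every query.
SELECT * FROM hdfs(
    "uri" = "hdfs://127.0.0.1:8020/path/to/student.csv",
    "format" = "csv",
    "resource" = "my_hdfs_resource"
);
```

This keeps connection settings in one place, so queries only name the resource.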
### Examples
diff --git a/docs/sql-manual/sql-functions/table-functions/s3.md
b/docs/sql-manual/sql-functions/table-functions/s3.md
index 6027f61141..5d7e25816c 100644
--- a/docs/sql-manual/sql-functions/table-functions/s3.md
+++ b/docs/sql-manual/sql-functions/table-functions/s3.md
@@ -99,6 +99,7 @@ The following 2 parameters are used for loading in csv format
other parameter:
- `path_partition_keys`: (optional) Specifies the column names carried in the file path. For example, if the file path is /path/to/city=beijing/date="2023-07-09", you should fill in `path_partition_keys="city,date"`. It will automatically read the corresponding column names and values from the path during the load process.
+- `resource`: (optional) Specifies the resource name. The S3 TVF can use an existing S3 resource to access S3 directly. For how to create an S3 resource, see [CREATE-RESOURCE](../../sql-statements/Data-Definition-Statements/Create/CREATE-RESOURCE.md). This property is supported starting from version 2.1.4.
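The same pattern applies on the S3 side; a hypothetical sketch (endpoint, region, keys, bucket, and resource name are placeholders; confirm the property keys against the CREATE-RESOURCE docs):

```sql
-- Create a reusable S3 resource once (illustrative properties).
CREATE RESOURCE "my_s3_resource"
PROPERTIES (
    "type" = "s3",
    "s3.endpoint" = "s3.us-east-1.amazonaws.com",
    "s3.region" = "us-east-1",
    "s3.access_key" = "ak",
    "s3.secret_key" = "sk"
);

-- Reference the resource from the s3() TVF so credentials
-- do not have to be passed inline in every query.
SELECT * FROM s3(
    "uri" = "s3://my-bucket/path/to/data.csv",
    "format" = "csv",
    "resource" = "my_s3_resource"
);
```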
### Example
diff --git
a/i18n/zh-CN/docusaurus-plugin-content-docs/current/sql-manual/sql-functions/table-functions/hdfs.md
b/i18n/zh-CN/docusaurus-plugin-content-docs/current/sql-manual/sql-functions/table-functions/hdfs.md
index 9e65320f3d..4ea148cf58 100644
---
a/i18n/zh-CN/docusaurus-plugin-content-docs/current/sql-manual/sql-functions/table-functions/hdfs.md
+++
b/i18n/zh-CN/docusaurus-plugin-content-docs/current/sql-manual/sql-functions/table-functions/hdfs.md
@@ -91,6 +91,7 @@ hdfs(
Other parameters:
- `path_partition_keys`: (optional) Specifies the partition column names carried in the file path. For example, if the file path is /path/to/city=beijing/date="2023-07-09", fill in `path_partition_keys="city,date"`; the corresponding column names and values will be read from the path automatically during import.
+- `resource`: (optional) Specifies the resource name. The HDFS TVF can use an existing HDFS resource to access HDFS directly. For how to create an HDFS resource, see [CREATE-RESOURCE](../../sql-statements/Data-Definition-Statements/Create/CREATE-RESOURCE.md). This feature is supported starting from version 2.1.4.
### Examples
diff --git
a/i18n/zh-CN/docusaurus-plugin-content-docs/current/sql-manual/sql-functions/table-functions/s3.md
b/i18n/zh-CN/docusaurus-plugin-content-docs/current/sql-manual/sql-functions/table-functions/s3.md
index b3dbc2e12e..8a91595401 100644
---
a/i18n/zh-CN/docusaurus-plugin-content-docs/current/sql-manual/sql-functions/table-functions/s3.md
+++
b/i18n/zh-CN/docusaurus-plugin-content-docs/current/sql-manual/sql-functions/table-functions/s3.md
@@ -99,6 +99,7 @@ Each parameter in the S3 TVF is a `"key"="value"` pair.
Other parameters:
- `path_partition_keys`: (optional) Specifies the partition column names carried in the file path. For example, if the file path is /path/to/city=beijing/date="2023-07-09", fill in `path_partition_keys="city,date"`; the corresponding column names and values will be read from the path automatically during import.
+- `resource`: (optional) Specifies the resource name. The S3 TVF can use an existing S3 resource to access S3 directly. For how to create an S3 resource, see [CREATE-RESOURCE](../../sql-statements/Data-Definition-Statements/Create/CREATE-RESOURCE.md). This feature is supported starting from version 2.1.4.
### Example
diff --git
a/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.1/sql-manual/sql-functions/table-functions/hdfs.md
b/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.1/sql-manual/sql-functions/table-functions/hdfs.md
index 9e65320f3d..4ea148cf58 100644
---
a/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.1/sql-manual/sql-functions/table-functions/hdfs.md
+++
b/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.1/sql-manual/sql-functions/table-functions/hdfs.md
@@ -91,6 +91,7 @@ hdfs(
Other parameters:
- `path_partition_keys`: (optional) Specifies the partition column names carried in the file path. For example, if the file path is /path/to/city=beijing/date="2023-07-09", fill in `path_partition_keys="city,date"`; the corresponding column names and values will be read from the path automatically during import.
+- `resource`: (optional) Specifies the resource name. The HDFS TVF can use an existing HDFS resource to access HDFS directly. For how to create an HDFS resource, see [CREATE-RESOURCE](../../sql-statements/Data-Definition-Statements/Create/CREATE-RESOURCE.md). This feature is supported starting from version 2.1.4.
### Examples
diff --git
a/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.1/sql-manual/sql-functions/table-functions/s3.md
b/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.1/sql-manual/sql-functions/table-functions/s3.md
index b3dbc2e12e..e7ab641ffb 100644
---
a/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.1/sql-manual/sql-functions/table-functions/s3.md
+++
b/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.1/sql-manual/sql-functions/table-functions/s3.md
@@ -99,6 +99,7 @@ Each parameter in the S3 TVF is a `"key"="value"` pair.
Other parameters:
- `path_partition_keys`: (optional) Specifies the partition column names carried in the file path. For example, if the file path is /path/to/city=beijing/date="2023-07-09", fill in `path_partition_keys="city,date"`; the corresponding column names and values will be read from the path automatically during import.
+- `resource`: (optional) Specifies the resource name. The S3 TVF can use an existing S3 resource to access S3 directly. For how to create an S3 resource, see [CREATE-RESOURCE](../../sql-statements/Data-Definition-Statements/Create/CREATE-RESOURCE.md). This feature is supported starting from version 2.1.4.
### Example
diff --git
a/versioned_docs/version-2.1/sql-manual/sql-functions/table-functions/hdfs.md
b/versioned_docs/version-2.1/sql-manual/sql-functions/table-functions/hdfs.md
index 3b73028086..7a281e06c6 100644
---
a/versioned_docs/version-2.1/sql-manual/sql-functions/table-functions/hdfs.md
+++
b/versioned_docs/version-2.1/sql-manual/sql-functions/table-functions/hdfs.md
@@ -91,6 +91,7 @@ File format parameters:
other kinds of parameters:
- `path_partition_keys`: (optional) Specifies the column names carried in the file path. For example, if the file path is /path/to/city=beijing/date="2023-07-09", you should fill in `path_partition_keys="city,date"`. It will automatically read the corresponding column names and values from the path during the load process.
+- `resource`: (optional) Specifies the resource name. The HDFS TVF can use an existing HDFS resource to access HDFS directly. For how to create an HDFS resource, see [CREATE-RESOURCE](../../sql-statements/Data-Definition-Statements/Create/CREATE-RESOURCE.md). This property is supported starting from version 2.1.4.
### Examples
diff --git
a/versioned_docs/version-2.1/sql-manual/sql-functions/table-functions/s3.md
b/versioned_docs/version-2.1/sql-manual/sql-functions/table-functions/s3.md
index 6027f61141..5d7e25816c 100644
--- a/versioned_docs/version-2.1/sql-manual/sql-functions/table-functions/s3.md
+++ b/versioned_docs/version-2.1/sql-manual/sql-functions/table-functions/s3.md
@@ -99,6 +99,7 @@ The following 2 parameters are used for loading in csv format
other parameter:
- `path_partition_keys`: (optional) Specifies the column names carried in the file path. For example, if the file path is /path/to/city=beijing/date="2023-07-09", you should fill in `path_partition_keys="city,date"`. It will automatically read the corresponding column names and values from the path during the load process.
+- `resource`: (optional) Specifies the resource name. The S3 TVF can use an existing S3 resource to access S3 directly. For how to create an S3 resource, see [CREATE-RESOURCE](../../sql-statements/Data-Definition-Statements/Create/CREATE-RESOURCE.md). This property is supported starting from version 2.1.4.
### Example
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]