This is an automated email from the ASF dual-hosted git repository.
dataroaring pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/doris.git
The following commit(s) were added to refs/heads/master by this push:
new bddf20dd72 [fix](docs) fix the command order of broker load doc. (#15861)
bddf20dd72 is described below
commit bddf20dd723c91eb9d801507c56ecd55c15e3914
Author: Hong Liu <[email protected]>
AuthorDate: Fri Jan 13 22:55:56 2023 +0800
[fix](docs) fix the command order of broker load doc. (#15861)
Co-authored-by: smallhibiscus <844981280>
---
.../Data-Manipulation-Statements/Load/BROKER-LOAD.md | 12 ++++++------
.../Data-Manipulation-Statements/Load/BROKER-LOAD.md | 12 ++++++------
2 files changed, 12 insertions(+), 12 deletions(-)
diff --git a/docs/en/docs/sql-manual/sql-reference/Data-Manipulation-Statements/Load/BROKER-LOAD.md b/docs/en/docs/sql-manual/sql-reference/Data-Manipulation-Statements/Load/BROKER-LOAD.md
index 4c7726d666..4aa1a1aff5 100644
--- a/docs/en/docs/sql-manual/sql-reference/Data-Manipulation-Statements/Load/BROKER-LOAD.md
+++ b/docs/en/docs/sql-manual/sql-reference/Data-Manipulation-Statements/Load/BROKER-LOAD.md
@@ -67,8 +67,8 @@ WITH BROKER broker_name
[FORMAT AS "file_type"]
[(column_list)]
[COLUMNS FROM PATH AS (c1, c2, ...)]
- [PRECEDING FILTER predicate]
[SET (column_mapping)]
+ [PRECEDING FILTER predicate]
[WHERE predicate]
[DELETE ON expr]
[ORDER BY source_sequence]
@@ -109,13 +109,13 @@ WITH BROKER broker_name
Specifies the columns to extract from the import file path.
- - `PRECEDING FILTER predicate`
-
-   Pre-filter conditions. The data is first concatenated into raw data rows in the order given by `column list` and `COLUMNS FROM PATH AS`, and is then filtered by the pre-filter conditions. For a detailed introduction to this part, please refer to the [Column Mapping, Conversion and Filtering](../../../../../data-operate/import/import-scenes/load-data-convert) document.
-
- `SET (column_mapping)`
Specifies the conversion function for the column.
+
+ - `PRECEDING FILTER predicate`
+
+   Pre-filter conditions. The data is first concatenated into raw data rows in the order given by `column list` and `COLUMNS FROM PATH AS`, and is then filtered by the pre-filter conditions. For a detailed introduction to this part, please refer to the [Column Mapping, Conversion and Filtering](../../../../../data-operate/import/import-scenes/load-data-convert) document.
- `WHERE predicate`
@@ -313,10 +313,10 @@ WITH BROKER broker_name
DATA INFILE("hdfs://host:port/input/file")
INTO TABLE `my_table`
(k1, k2, k3)
- PRECEDING FILTER k1 = 1
SET (
k2 = k2 + 1
)
+ PRECEDING FILTER k1 = 1
WHERE k1 > k2
)
WITH BROKER hdfs
diff --git a/docs/zh-CN/docs/sql-manual/sql-reference/Data-Manipulation-Statements/Load/BROKER-LOAD.md b/docs/zh-CN/docs/sql-manual/sql-reference/Data-Manipulation-Statements/Load/BROKER-LOAD.md
index aed99bff02..79754c2be5 100644
--- a/docs/zh-CN/docs/sql-manual/sql-reference/Data-Manipulation-Statements/Load/BROKER-LOAD.md
+++ b/docs/zh-CN/docs/sql-manual/sql-reference/Data-Manipulation-Statements/Load/BROKER-LOAD.md
@@ -67,8 +67,8 @@ WITH BROKER broker_name
[FORMAT AS "file_type"]
[(column_list)]
[COLUMNS FROM PATH AS (c1, c2, ...)]
- [PRECEDING FILTER predicate]
[SET (column_mapping)]
+ [PRECEDING FILTER predicate]
[WHERE predicate]
[DELETE ON expr]
[ORDER BY source_sequence]
@@ -109,14 +109,14 @@ WITH BROKER broker_name
Specifies the columns to extract from the import file path.
- - `PRECEDING FILTER predicate`
-
-   Pre-filter conditions. The data is first concatenated into raw data rows in the order given by `column list` and `COLUMNS FROM PATH AS`, and is then filtered by the pre-filter conditions. For a detailed introduction to this part, please refer to the [Column Mapping, Conversion and Filtering](../../../../../data-operate/import/import-scenes/load-data-convert) document.
-
- `SET (column_mapping)`
Specifies the conversion function for the column.
+ - `PRECEDING FILTER predicate`
+
+   Pre-filter conditions. The data is first concatenated into raw data rows in the order given by `column list` and `COLUMNS FROM PATH AS`, and is then filtered by the pre-filter conditions. For a detailed introduction to this part, please refer to the [Column Mapping, Conversion and Filtering](../../../../../data-operate/import/import-scenes/load-data-convert) document.
+
- `WHERE predicate`
  Filters the imported data according to the conditions. For a detailed introduction to this part, please refer to the [Column Mapping, Conversion and Filtering](../../../../../data-operate/import/import-scenes/load-data-convert) document.
@@ -312,10 +312,10 @@ WITH BROKER broker_name
DATA INFILE("hdfs://host:port/input/file")
INTO TABLE `my_table`
(k1, k2, k3)
- PRECEDING FILTER k1 = 1
SET (
k2 = k2 + 1
)
+ PRECEDING FILTER k1 = 1
WHERE k1 > k2
)
WITH BROKER hdfs
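
For reference, the clause ordering this patch corrects can be seen in one complete statement. The sketch below only illustrates the order of the clauses; the label, host, file path, and broker credentials are placeholders, not values from the patch:

```sql
LOAD LABEL example_db.example_label
(
    DATA INFILE("hdfs://host:port/input/file")
    INTO TABLE `my_table`
    (k1, k2, k3)
    -- SET is written before PRECEDING FILTER in the statement, even though
    -- the pre-filter is applied to the raw rows before the SET conversions run
    SET (
        k2 = k2 + 1
    )
    PRECEDING FILTER k1 = 1
    WHERE k1 > k2
)
WITH BROKER hdfs
(
    "username" = "hdfs_user",
    "password" = "hdfs_passwd"
);
```

Note the distinction the docs draw: `PRECEDING FILTER` filters the raw concatenated rows, while `WHERE` filters after the `SET` conversions have been applied.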
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]