This is an automated email from the ASF dual-hosted git repository.
jiafengzheng pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/doris.git
The following commit(s) were added to refs/heads/master by this push:
new 42960ffd08 [typo](docs)fix docs format (#16279)
42960ffd08 is described below
commit 42960ffd085cea8e21de14f3796d21ef86018c19
Author: lsy3993 <[email protected]>
AuthorDate: Thu Feb 2 14:13:17 2023 +0800
[typo](docs)fix docs format (#16279)
---
.../data-operate/import/import-way/load-json-format.md | 4 ++++
.../sql-manual/sql-functions/date-time-functions/week.md | 1 +
.../Data-Manipulation-Statements/Load/STREAM-LOAD.md | 14 --------------
.../data-operate/import/import-way/load-json-format.md | 4 ++++
.../sql-manual/sql-functions/date-time-functions/week.md | 1 +
.../Data-Manipulation-Statements/Load/STREAM-LOAD.md | 7 -------
6 files changed, 10 insertions(+), 21 deletions(-)
diff --git a/docs/en/docs/data-operate/import/import-way/load-json-format.md b/docs/en/docs/data-operate/import/import-way/load-json-format.md
index 7fdb02dea3..4cf5f765c9 100644
--- a/docs/en/docs/data-operate/import/import-way/load-json-format.md
+++ b/docs/en/docs/data-operate/import/import-way/load-json-format.md
@@ -524,6 +524,7 @@ curl --location-trusted -u root: -H "max_filter_ration:0.01" -H "format:json" -
```
Import result:
+```
MySQL > select * from array_test_decimal;
+------+----------------------------------+
| k1 | k2 |
@@ -531,6 +532,7 @@ MySQL > select * from array_test_decimal;
| 39 | [-818.2173181] |
| 40 | [100000000000000000.001111111] |
+------+----------------------------------+
+```
```json
@@ -542,12 +544,14 @@ curl --location-trusted -u root: -H "max_filter_ration:0.01" -H "format:json" -
```
Import result:
+```
MySQL > select * from array_test_largeint;
+------+------------------------------------------------------------------------------------+
| k1   | k2                                                                                 |
+------+------------------------------------------------------------------------------------+
| 999  | [76959836937749932879763573681792701709, 26017042825937891692910431521038521227]  |
+------+------------------------------------------------------------------------------------+
+```
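The two result tables above come from stream-loading JSON arrays into decimal and largeint columns. As a minimal local sketch (hypothetical data file; no running Doris cluster is assumed, and the `strip_outer_array` header shown in the comment is only one plausible choice for this payload shape), the decimal payload can be built and sanity-checked before posting it:

```shell
# Build a JSON payload shaped like the decimal example above
# (hypothetical file path; the table name array_test_decimal comes from the docs).
cat > /tmp/array_test_decimal.json <<'EOF'
[{"k1": 39, "k2": [-818.2173181]},
 {"k1": 40, "k2": [100000000000000000.001111111]}]
EOF

# The actual load would then POST it, e.g. (host:port is a placeholder):
#   curl --location-trusted -u root: -H "format:json" -H "strip_outer_array:true" \
#        -T /tmp/array_test_decimal.json \
#        http://host:port/api/testDb/array_test_decimal/_stream_load

# Sanity-check locally: the payload parses as JSON and holds two rows.
python3 -c 'import json; print(len(json.load(open("/tmp/array_test_decimal.json"))))'   # prints 2
```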
### Routine Load
diff --git a/docs/en/docs/sql-manual/sql-functions/date-time-functions/week.md b/docs/en/docs/sql-manual/sql-functions/date-time-functions/week.md
index 2943e45f85..6a64b2fc7d 100644
--- a/docs/en/docs/sql-manual/sql-functions/date-time-functions/week.md
+++ b/docs/en/docs/sql-manual/sql-functions/date-time-functions/week.md
@@ -33,6 +33,7 @@ under the License.
Returns the week number for date. The value of the mode argument defaults to 0.
The following table describes how the mode argument works.
+
|Mode |First day of week |Range |Week 1 is the first week … |
|:----|:-----------------|:------|:-----------------------------|
|0 |Sunday |0-53 |with a Sunday in this year |
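The mode-0 row above (weeks start on Sunday, range 0-53, week 1 is the first week containing a Sunday) lines up with the `%U` week number of GNU `date`, which makes for a quick local cross-check. This is only an analogue of the documented week() behavior, not the Doris function itself, and it assumes GNU coreutils `date`:

```shell
# GNU date's %U counts weeks starting on Sunday, range 00-53; days before the
# year's first Sunday fall in week 00 -- the same convention as mode 0 above.
date -d '2023-01-01' +%U   # 2023-01-01 is a Sunday   -> 01
date -d '2022-01-01' +%U   # 2022-01-01 is a Saturday -> 00
```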
diff --git a/docs/en/docs/sql-manual/sql-reference/Data-Manipulation-Statements/Load/STREAM-LOAD.md b/docs/en/docs/sql-manual/sql-reference/Data-Manipulation-Statements/Load/STREAM-LOAD.md
index f01ce5126b..042f0b21a8 100644
--- a/docs/en/docs/sql-manual/sql-reference/Data-Manipulation-Statements/Load/STREAM-LOAD.md
+++ b/docs/en/docs/sql-manual/sql-reference/Data-Manipulation-Statements/Load/STREAM-LOAD.md
@@ -194,50 +194,36 @@ ERRORS:
````
2. Import the data in the local file 'testData' into the table 'testTbl' in the database 'testDb', use Label for deduplication, and only import data whose k1 is equal to 20180601
-
-
````
curl --location-trusted -u root -H "label:123" -H "where: k1=20180601" -T testData http://host:port/api/testDb/testTbl/_stream_load
````
3. Import the data in the local file 'testData' into the table 'testTbl' in the database 'testDb', allowing a 20% error rate (the user is in the default_cluster)
-
-
````
curl --location-trusted -u root -H "label:123" -H "max_filter_ratio:0.2" -T testData http://host:port/api/testDb/testTbl/_stream_load
````
4. Import the data in the local file 'testData' into the table 'testTbl' in the database 'testDb', allow a 20% error rate, and specify the column names of the file (the user is in the default_cluster)
-
-
````
curl --location-trusted -u root -H "label:123" -H "max_filter_ratio:0.2" -H "columns: k2, k1, v1" -T testData http://host:port/api/testDb/testTbl/_stream_load
````
5. Import the data in the local file 'testData' into the p1, p2 partitions of the table 'testTbl' in the database 'testDb', allowing a 20% error rate.
-
-
````
curl --location-trusted -u root -H "label:123" -H "max_filter_ratio:0.2" -H "partitions: p1, p2" -T testData http://host:port/api/testDb/testTbl/_stream_load
````
6. Import using streaming (the user is in the default_cluster)
-
-
````
seq 1 10 | awk '{OFS="\t"}{print $1, $1 * 10}' | curl --location-trusted -u root -T - http://host:port/api/testDb/testTbl/_stream_load
````
7. Import into a table containing HLL columns; columns in the table or in the data can be used to generate the HLL columns, or hll_empty can be used to fill in columns that are not in the data
-
-
````
curl --location-trusted -u root -H "columns: k1, k2, v1=hll_hash(k1), v2=hll_empty()" -T testData http://host:port/api/testDb/testTbl/_stream_load
````
8. Import data with strict-mode filtering and set the time zone to Africa/Abidjan
-
-
````
curl --location-trusted -u root -H "strict_mode: true" -H "timezone: Africa/Abidjan" -T testData http://host:port/api/testDb/testTbl/_stream_load
````
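Example 6 above is the only one whose data source runs without a cluster: the `seq | awk` generator. A quick look at what that pipeline actually feeds into curl (the curl endpoint itself is not exercised here):

```shell
# Generate the tab-separated rows that example 6 streams into curl:
# column 1 is the sequence number, column 2 is ten times it.
seq 1 3 | awk '{OFS="\t"}{print $1, $1 * 10}'
# 1	10
# 2	20
# 3	30
```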
diff --git a/docs/zh-CN/docs/data-operate/import/import-way/load-json-format.md b/docs/zh-CN/docs/data-operate/import/import-way/load-json-format.md
index fd7a91953b..605f36790f 100644
--- a/docs/zh-CN/docs/data-operate/import/import-way/load-json-format.md
+++ b/docs/zh-CN/docs/data-operate/import/import-way/load-json-format.md
@@ -526,6 +526,7 @@ curl --location-trusted -u root: -H "max_filter_ration:0.01" -H "format:json" -
```
Import result:
+```
MySQL > select * from array_test_decimal;
+------+----------------------------------+
| k1 | k2 |
@@ -533,6 +534,7 @@ MySQL > select * from array_test_decimal;
| 39 | [-818.2173181] |
| 40 | [100000000000000000.001111111] |
+------+----------------------------------+
+```
```json
@@ -544,12 +546,14 @@ curl --location-trusted -u root: -H "max_filter_ration:0.01" -H "format:json" -
```
Import result:
+```
MySQL > select * from array_test_largeint;
+------+------------------------------------------------------------------------------------+
| k1   | k2                                                                                 |
+------+------------------------------------------------------------------------------------+
| 999  | [76959836937749932879763573681792701709, 26017042825937891692910431521038521227]  |
+------+------------------------------------------------------------------------------------+
+```
### Routine Load
diff --git a/docs/zh-CN/docs/sql-manual/sql-functions/date-time-functions/week.md b/docs/zh-CN/docs/sql-manual/sql-functions/date-time-functions/week.md
index 4397cf1f64..9a10026905 100644
--- a/docs/zh-CN/docs/sql-manual/sql-functions/date-time-functions/week.md
+++ b/docs/zh-CN/docs/sql-manual/sql-functions/date-time-functions/week.md
@@ -33,6 +33,7 @@ under the License.
Returns the week number for the given date. The value of mode defaults to 0.
The following table describes how the mode argument works:
+
|Mode |First day of week |Range |Week 1 is the first week … |
|:---|:-------------|:-----------|:--------------------------------------------|
|0 |Sunday |0-53 |with a Sunday in this year |
diff --git a/docs/zh-CN/docs/sql-manual/sql-reference/Data-Manipulation-Statements/Load/STREAM-LOAD.md b/docs/zh-CN/docs/sql-manual/sql-reference/Data-Manipulation-Statements/Load/STREAM-LOAD.md
index d383705d75..c6d1f4188d 100644
--- a/docs/zh-CN/docs/sql-manual/sql-reference/Data-Manipulation-Statements/Load/STREAM-LOAD.md
+++ b/docs/zh-CN/docs/sql-manual/sql-reference/Data-Manipulation-Statements/Load/STREAM-LOAD.md
@@ -191,43 +191,36 @@ ERRORS:
```
2. Import the data in the local file 'testData' into the table 'testTbl' in the database 'testDb', use a Label for deduplication, and only import data whose k1 is equal to 20180601
-
```
curl --location-trusted -u root -H "label:123" -H "where: k1=20180601" -T testData http://host:port/api/testDb/testTbl/_stream_load
```
3. Import the data in the local file 'testData' into the table 'testTbl' in the database 'testDb', allowing a 20% error rate (the user is in the default_cluster)
-
```
curl --location-trusted -u root -H "label:123" -H "max_filter_ratio:0.2" -T testData http://host:port/api/testDb/testTbl/_stream_load
```
4. Import the data in the local file 'testData' into the table 'testTbl' in the database 'testDb', allowing a 20% error rate and specifying the column names of the file (the user is in the default_cluster)
-
```
curl --location-trusted -u root -H "label:123" -H "max_filter_ratio:0.2" -H "columns: k2, k1, v1" -T testData http://host:port/api/testDb/testTbl/_stream_load
```
5. Import the data in the local file 'testData' into the p1, p2 partitions of the table 'testTbl' in the database 'testDb', allowing a 20% error rate.
-
```
curl --location-trusted -u root -H "label:123" -H "max_filter_ratio:0.2" -H "partitions: p1, p2" -T testData http://host:port/api/testDb/testTbl/_stream_load
```
6. Import using streaming (the user is in the default_cluster)
-
```
seq 1 10 | awk '{OFS="\t"}{print $1, $1 * 10}' | curl --location-trusted -u root -T - http://host:port/api/testDb/testTbl/_stream_load
```
7. Import into a table containing HLL columns; columns in the table or in the data can be used to generate the HLL columns, or hll_empty can be used to fill in columns that are not in the data
-
```
curl --location-trusted -u root -H "columns: k1, k2, v1=hll_hash(k1), v2=hll_empty()" -T testData http://host:port/api/testDb/testTbl/_stream_load
```
8. Import data with strict-mode filtering and set the time zone to Africa/Abidjan
-
```
curl --location-trusted -u root -H "strict_mode: true" -H "timezone: Africa/Abidjan" -T testData http://host:port/api/testDb/testTbl/_stream_load
```
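Example 8 sets the session time zone to Africa/Abidjan through the timezone header. One detail worth confirming locally (assuming the system has tzdata installed): Africa/Abidjan has a fixed UTC+00:00 offset with no daylight saving, so datetimes in such a load are effectively interpreted as UTC:

```shell
# Africa/Abidjan sits at +0000 year-round, unlike e.g. Asia/Shanghai at +0800:
TZ=Africa/Abidjan date -d @0 +%z   # +0000
TZ=Asia/Shanghai  date -d @0 +%z   # +0800
```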