This is an automated email from the ASF dual-hosted git repository.

zykkk pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/doris.git


The following commit(s) were added to refs/heads/master by this push:
     new bbbefc4b6f [typo](docs) Capitalize and Rename Title of Files in Data Operation-Export (#22457)
bbbefc4b6f is described below

commit bbbefc4b6f4733fb25a99018e8ccca15c844b817
Author: KassieZ <[email protected]>
AuthorDate: Wed Aug 2 21:56:38 2023 +0800

    [typo](docs) Capitalize and Rename Title of Files in Data Operation-Export (#22457)
---
 docs/en/docs/data-operate/export/export-manual.md      | 18 +++++++++---------
 ...rt_with_mysql_dump.md => export-with-mysql-dump.md} |  6 +++---
 docs/en/docs/data-operate/export/outfile.md            | 12 ++++++------
 docs/sidebars.json                                     |  2 +-
 ...rt_with_mysql_dump.md => export-with-mysql-dump.md} |  4 ++--
 5 files changed, 21 insertions(+), 21 deletions(-)

diff --git a/docs/en/docs/data-operate/export/export-manual.md b/docs/en/docs/data-operate/export/export-manual.md
index ba852fc207..90e65825ac 100644
--- a/docs/en/docs/data-operate/export/export-manual.md
+++ b/docs/en/docs/data-operate/export/export-manual.md
@@ -1,6 +1,6 @@
 ---
 {
-    "title": "Data export",
+    "title": "Export Overview",
     "language": "en"
 }
 ---
@@ -24,7 +24,7 @@ specific language governing permissions and limitations
 under the License.
 -->
 
-# Data export
+# Export Overview
 
 Export is a function provided by Doris to export data. This function can export user-specified table or partition data in text format to remote storage through Broker process, such as HDFS / Object storage (supports S3 protocol) etc.
 
@@ -37,7 +37,7 @@ This document mainly introduces the basic principles, usage, best practices and
 * Broker: Doris can manipulate files for remote storage through the Broker process.
 * Tablet: Data fragmentation. A table is divided into multiple data fragments.
 
-## Principle
+## Principles
 
 After the user submits an Export job. Doris counts all Tablets involved in this job. These tablets are then grouped to generate a special query plan for each group. The query plan reads the data on the included tablet and then writes the data to the specified path of the remote storage through Broker. It can also be directly exported to the remote storage that supports S3 protocol through S3 protocol.
 
@@ -75,7 +75,7 @@ The overall mode of dispatch is as follows:
        1. PENDING: FE generates Export Pending Task, sends snapshot command to BE, and takes a snapshot of all Tablets involved. And generate multiple query plans.
        2. EXPORTING: FE generates Export ExportingTask and starts executing the query plan.
 
-### query plan splitting
+### Query Plan Splitting
 
 The Export job generates multiple query plans, each of which scans a portion of the Tablet. The number of Tablets scanned by each query plan is specified by the FE configuration parameter `export_tablet_num_per_task`, which defaults to 5. That is, assuming a total of 100 Tablets, 20 query plans will be generated. Users can also specify this number by the job attribute `tablet_num_per_task`, when submitting a job.
 
@@ -95,7 +95,7 @@ Among them, `c69fcf2b6db5420f-a96b94c1ff8bccef` is the query ID of the query plan
 
 When all data is exported, Doris will rename these files to the user-specified path.
 
-### Broker parameter
+### Broker Parameter
 
 Export needs to use the Broker process to access remote storage. Different brokers need to provide different parameters. For details, please refer to [Broker documentation](../../advanced/broker.md)
 
@@ -106,7 +106,7 @@ For detailed usage of Export, please refer to [SHOW EXPORT](../../sql-manual/sql
 
 Export's detailed commands can be passed through `HELP EXPORT;` Examples are as follows:
 
-### Export to hdfs
+### Export to HDFS
 
 ```sql
 EXPORT TABLE db1.tbl1 
@@ -136,7 +136,7 @@ WITH BROKER "hdfs"
 * `timeout`: homework timeout. Default 2 hours. Unit seconds.
 * `tablet_num_per_task`: The maximum number of fragments allocated per query plan. The default is 5.
 
-### Export to object storage (supports S3 protocol)
+### Export to Object Storage (Supports S3 Protocol)
 
 ```sql
 EXPORT TABLE test TO "s3://bucket/path/to/export/dir/" WITH S3  (
@@ -151,7 +151,7 @@ EXPORT TABLE test TO "s3://bucket/path/to/export/dir/" WITH S3  (
 - `AWS_ENDPOINT`:Endpoint indicates the access domain name of object storage external services.
 - `AWS_REGION`:Region indicates the region where the object storage data center is located.
 
-### View export status
+### View Export Status
 
 After submitting a job, the job status can be viewed by querying the [SHOW EXPORT](../../sql-manual/sql-reference/Show-Statements/SHOW-EXPORT.md) command. The results are as follows:
 
@@ -194,7 +194,7 @@ FinishTime: 2019-06-25 17:08:34
 * Timeout: Job timeout. The unit is seconds. This time is calculated from CreateTime.
 * Error Msg: If there is an error in the job, the cause of the error is shown here.
 
-### Cancel export job
+### Cancel Export Job
 
 <version since="dev"></version>
 
diff --git a/docs/en/docs/data-operate/export/export_with_mysql_dump.md b/docs/en/docs/data-operate/export/export-with-mysql-dump.md
similarity index 93%
rename from docs/en/docs/data-operate/export/export_with_mysql_dump.md
rename to docs/en/docs/data-operate/export/export-with-mysql-dump.md
index ee2ce765ae..2ce908a1d8 100644
--- a/docs/en/docs/data-operate/export/export_with_mysql_dump.md
+++ b/docs/en/docs/data-operate/export/export-with-mysql-dump.md
@@ -1,6 +1,6 @@
 ---
 {
-"title": "Use mysqldump data to export table structure or data",
+"title": "Export Data or Table Structure with MYSQLDUMP ",
 "language": "en"
 }
 ---
@@ -24,10 +24,10 @@ specific language governing permissions and limitations
 under the License.
 -->
 
-# Use mysqldump data to export table structure or data
+# Export Data or Table Structure with MYSQLDUMP
 Doris has supported exporting data or table structures through the `mysqldump` tool after version 0.15
 
-## Example
+## Examples
 ### Export
   1. Export the table1 table in the test database: `mysqldump -h127.0.0.1 -P9030 -uroot --no-tablespaces --databases test --tables table1`
   2. Export the table1 table structure in the test database: `mysqldump -h127.0.0.1 -P9030 -uroot --no-tablespaces --databases test --tables table1 --no-data`
diff --git a/docs/en/docs/data-operate/export/outfile.md b/docs/en/docs/data-operate/export/outfile.md
index 9532377cde..e5c77f751e 100644
--- a/docs/en/docs/data-operate/export/outfile.md
+++ b/docs/en/docs/data-operate/export/outfile.md
@@ -28,7 +28,7 @@ under the License.
 
 This document describes how to use the [SELECT INTO OUTFILE](../../sql-manual/sql-reference/Data-Manipulation-Statements/OUTFILE.md) command to export query results.
 
-## Example
+## Examples
 
 ### Export to HDFS
 
@@ -46,7 +46,7 @@ PROPERTIES
 );
 ```
 
-### Export to local file
+### Export to Local Files
 
 When exporting to a local file, you need to configure `enable_outfile_to_local=true` in fe.conf first
 
@@ -57,7 +57,7 @@ INTO OUTFILE "file:///home/work/path/result_";
 
 For more usage, see [OUTFILE documentation](../../sql-manual/sql-reference/Data-Manipulation-Statements/OUTFILE.md).
 
-## Concurrent export
+## Concurrent Export
 
 By default, the export of the query result set is non-concurrent, that is, a single point of export. If the user wants the query result set to be exported concurrently, the following conditions need to be met:
 
@@ -67,7 +67,7 @@ By default, the export of the query result set is non-concurrent, that is, a sin
 
 If the above three conditions are met, the concurrent export query result set can be triggered. Concurrency = ```be_instacne_num * parallel_fragment_exec_instance_num```
 
-### How to verify that the result set is exported concurrently
+### How to Verify that the Result Set is Exported Concurrently
 
 After the user enables concurrent export through the session variable setting, if you want to verify whether the current query can be exported concurrently, you can use the following method.
 
@@ -104,11 +104,11 @@ Planning example for concurrent export:
 +-----------------------------------------------------------------------------+
 ```
 
-## Usage example
+## Usage Examples
 
 For details, please refer to [OUTFILE Document](../../sql-manual/sql-reference/Data-Manipulation-Statements/OUTFILE.md).
 
-## Return result
+## Return Results
 
 The command is a synchronization command. The command returns, which means the operation is over.
 At the same time, a row of results will be returned to show the exported execution result.
diff --git a/docs/sidebars.json b/docs/sidebars.json
index 2e248dd1b0..d3c1daa533 100644
--- a/docs/sidebars.json
+++ b/docs/sidebars.json
@@ -102,7 +102,7 @@
                     "items": [
                         "data-operate/export/export-manual",
                         "data-operate/export/outfile",
-                        "data-operate/export/export_with_mysql_dump"
+                        "data-operate/export/export-with-mysql-dump"
                     ]
                 },
                 {
diff --git a/docs/zh-CN/docs/data-operate/export/export_with_mysql_dump.md b/docs/zh-CN/docs/data-operate/export/export-with-mysql-dump.md
similarity index 94%
rename from docs/zh-CN/docs/data-operate/export/export_with_mysql_dump.md
rename to docs/zh-CN/docs/data-operate/export/export-with-mysql-dump.md
index a98bdb1399..65a306c103 100644
--- a/docs/zh-CN/docs/data-operate/export/export_with_mysql_dump.md
+++ b/docs/zh-CN/docs/data-operate/export/export-with-mysql-dump.md
@@ -1,6 +1,6 @@
 ---
 {
-    "title": "Mysqldump导出表结构或数据",
+    "title": "MYSQLDUMP 导出表结构或数据",
     "language": "zh-CN"
 }
 ---
@@ -24,7 +24,7 @@ specific language governing permissions and limitations
 under the License.
 -->
 
-# 使用mysqldump数据导出表结构或者数据
+# 使用 MYSQLDUMP 数据导出表结构或者数据
 Doris 在0.15 之后的版本已经支持通过`mysqldump` 工具导出数据或者表结构
 
 ## 使用示例


---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
