This is an automated email from the ASF dual-hosted git repository.

dataroaring pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/doris-website.git


The following commit(s) were added to refs/heads/master by this push:
     new 27c348c0af8 [docs] Fix dead links in import docs (#3464)
27c348c0af8 is described below

commit 27c348c0af86ffd5b68933ffeac21182c5f77f90
Author: Yongqiang YANG <[email protected]>
AuthorDate: Sat Mar 14 05:47:39 2026 -0700

    [docs] Fix dead links in import docs (#3464)
    
    ## Summary
    - replace broken external Routine Load and Stream Load links with stable
    internal docs links
    - fix the invalid Time Zone reference in the BROKER LOAD doc to point to
    the correct Doris page
---
 docs/ecosystem/automq-load.md                                         | 2 +-
 docs/ecosystem/datax.md                                               | 3 +--
 .../sql-statements/data-modification/load-and-export/BROKER-LOAD.md   | 4 ++--
 3 files changed, 4 insertions(+), 5 deletions(-)

diff --git a/docs/ecosystem/automq-load.md b/docs/ecosystem/automq-load.md
index 0ddff7e0a63..4fd9bfc6942 100644
--- a/docs/ecosystem/automq-load.md
+++ b/docs/ecosystem/automq-load.md
@@ -9,7 +9,7 @@
 [AutoMQ](https://github.com/AutoMQ/automq) is a cloud-native fork of Kafka by separating storage to object storage like S3. It remains 100% compatible with Apache Kafka® while offering users up to a 10x cost-effective and 100x elasticity . Through its innovative shared storage architecture, it achieves capabilities such as reassign partitions in seconds, self-balancing and auto scaling in seconds while ensuring high throughput and low latency.
 ![AutoMQ Storage Architecture](/images/automq/automq_storage_architecture.png)
 
-This article will explain how to use Apache Doris Routine Load to import data from AutoMQ into Doris. For more details on Routine Load, please refer to the [Routine Load](https://doris.apache.org/docs/data-operate/import/routine-load-manual/) document.
+This article will explain how to use Apache Doris Routine Load to import data from AutoMQ into Doris. For more details on Routine Load, please refer to the [Routine Load](../data-operate/import/import-way/routine-load-manual.md) document.
 
 ## Environment Preparation
 ### Prepare Apache Doris and Test Data
diff --git a/docs/ecosystem/datax.md b/docs/ecosystem/datax.md
index 3c89ce7f148..bb0d883dc8e 100644
--- a/docs/ecosystem/datax.md
+++ b/docs/ecosystem/datax.md
@@ -123,7 +123,7 @@ Download the [source code](https://github.com/apache/doris/tree/master/extension
 
 * **loadProps**
 
-  - Description: The request parameter of StreamLoad. For details, refer to the StreamLoad introduction page. [Stream load - Apache Doris](https://doris.apache.org/docs/data-operate/import/import-way/stream-load-manual)
+  - Description: The request parameter of StreamLoad. For details, refer to the StreamLoad introduction page. [Stream load - Apache Doris](../data-operate/import/import-way/stream-load-manual.md)
 
     This includes the imported data format: format, etc. The imported data format defaults to csv, which supports JSON. For details, please refer to the type conversion section below, or refer to the official information of Stream load above.
 
@@ -325,4 +325,3 @@ Total Records Read                     : 2
 Total Read/Write Failures              : 0
 
 ```
-
diff --git a/docs/sql-manual/sql-statements/data-modification/load-and-export/BROKER-LOAD.md b/docs/sql-manual/sql-statements/data-modification/load-and-export/BROKER-LOAD.md
index 0949cd9f219..a64bcefb884 100644
--- a/docs/sql-manual/sql-statements/data-modification/load-and-export/BROKER-LOAD.md
+++ b/docs/sql-manual/sql-statements/data-modification/load-and-export/BROKER-LOAD.md
@@ -135,7 +135,7 @@ WITH BROKER "<broker_name>"
 | exec_mem_limit | Import memory limit, with a default of 2GB and the unit in bytes. |
 | strict_mode | Whether to impose strict restrictions on the data, with a default of `false`. |
 | partial_columns | A boolean type. When set to `true`, it indicates using partial - column updates, with a default value of `false`. It can only be set when the table model is Unique and uses Merge on Write. |
-| timezone | Specifies the time zone, which affects some functions affected by the time zone, such as `strftime`, `alignment_timestamp`, `from_unixtime`, etc. For details, please refer to the [Time Zone](https://chatgpt.com/advanced/time - zone.md) documentation. If not specified, "Asia/Shanghai" will be used. |
+| timezone | Specifies the time zone, which affects some functions affected by the time zone, such as `strftime`, `alignment_timestamp`, `from_unixtime`, etc. For details, please refer to the [Time Zone](../../../../admin-manual/cluster-management/time-zone.md) documentation. If not specified, "Asia/Shanghai" will be used. |
 | load_parallelism | Import concurrency. The default is 1. Increasing the import concurrency will start multiple execution plans to execute the import task simultaneously, speeding up the import process. |
 | send_batch_parallelism | Sets the parallelism for sending batch data. If the value of the parallelism exceeds `max_send_batch_parallelism_per_job` in the BE configuration, the value of `max_send_batch_parallelism_per_job` will be used. |
 | load_to_single_tablet | A boolean type. When set to `true`, it indicates supporting importing data into a single tablet of the corresponding partition, with a default value of `false`. The number of tasks in the job depends on the overall concurrency and can only be set when importing an OLAP table with a random bucket. |
@@ -152,4 +152,4 @@ Users executing this SQL command must have at least the following permissions:
 
 ## Examples
 
-For complete examples covering S3, HDFS, JSON format, Merge mode, path-based partition extraction, and more, refer to [Broker Load](../../../../data-operate/import/import-way/broker-load-manual.md) in the Data Import guide.
\ No newline at end of file
+For complete examples covering S3, HDFS, JSON format, Merge mode, path-based partition extraction, and more, refer to [Broker Load](../../../../data-operate/import/import-way/broker-load-manual.md) in the Data Import guide.
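The pattern behind all three hunks is the same: an absolute `https://doris.apache.org/docs/...` URL, which goes stale when the site is reorganized, is replaced with a relative `.md` path that the docs build resolves and validates. A quick way to audit a docs tree for this class of dead link is sketched below; this script is hypothetical and not part of the commit, and the regex is a simplifying assumption (it only catches inline-style markdown links).

```python
import re

# Matches inline markdown links whose target is an absolute docs-site URL,
# e.g. [Routine Load](https://doris.apache.org/docs/...). Reference-style
# links and bare URLs are deliberately out of scope for this sketch.
ABSOLUTE_DOCS_LINK = re.compile(r"\]\((https?://doris\.apache\.org/docs/[^)\s]+)\)")

def find_absolute_docs_links(markdown: str) -> list[str]:
    """Return every absolute doris.apache.org docs URL used as a link target."""
    return [m.group(1) for m in ABSOLUTE_DOCS_LINK.finditer(markdown)]

sample = (
    "please refer to the [Routine Load]"
    "(https://doris.apache.org/docs/data-operate/import/routine-load-manual/) document."
)
print(find_absolute_docs_links(sample))
# A relative link like (../data-operate/import/import-way/routine-load-manual.md)
# produces no match, so a clean file returns an empty list.
```

Run over every `.md` file in the repo, any non-empty result marks a link that should likely become a relative internal path, as this commit does.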


---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
