imay commented on a change in pull request #3553:
URL: https://github.com/apache/incubator-doris/pull/3553#discussion_r426329395



##########
File path: docs/zh-CN/sql-reference/sql-statements/Data Manipulation/STREAM LOAD.md
##########
@@ -127,7 +142,35 @@ under the License.
     9. Load into a table containing BITMAP columns. A column from the table or from the data can be used to generate the BITMAP column, or bitmap_empty can be used to fill in an empty Bitmap
         curl --location-trusted -u root -H "columns: k1, k2, v1=to_bitmap(k1), 
v2=bitmap_empty()" -T testData http://host:port/api/testDb/testTbl/_stream_load
 
- 
+    10. Simple mode, load json data
+         Table schema:
+           `category` varchar(512) NULL COMMENT "",
+           `author` varchar(512) NULL COMMENT "",
+           `title` varchar(512) NULL COMMENT "",
+           `price` double NULL COMMENT ""
+       json data format:
+           {"category":"C++","author":"avc","title":"C++ primer","price":895}
+         Load command:
+           curl --location-trusted -u root -H "label:123" -H "format: json" -T testData http://host:port/api/testDb/testTbl/_stream_load
+         To improve throughput, loading multiple records at once is supported; the json data format is as follows:
+           {"RECORDS":[
+               {"category":"C++","author":"avc","title":"C++ 
primer","price":89.5},
+               {"category":"Java","author":"avc","title":"Effective 
Java","price":95},
+               {"category":"Linux","author":"avc","title":"Linux 
kernel","price":195}
+              ]}

Review comment:
       As I understand it, the format should not look like that; it should instead be the following:
   [
                  {"category":"C++","author":"avc","title":"C++ 
primer","price":89.5},
                  {"category":"Java","author":"avc","title":"Effective 
Java","price":95},
                  {"category":"Linux","author":"avc","title":"Linux 
kernel","price":195}
   ]
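
   For illustration only, a minimal sketch of what the data file (the name testData is taken from the command above) could contain under the array format proposed here, loaded with the same curl call already shown in the diff. Whether an extra header is needed to signal a top-level array depends on how this PR implements it, so that part is left out:

       # testData: top-level json array, one object per row
       [
           {"category":"C++","author":"avc","title":"C++ primer","price":89.5},
           {"category":"Java","author":"avc","title":"Effective Java","price":95},
           {"category":"Linux","author":"avc","title":"Linux kernel","price":195}
       ]

       # same stream load call as in the doc above
       curl --location-trusted -u root -H "label:123" -H "format: json" -T testData http://host:port/api/testDb/testTbl/_stream_load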

##########
File path: docs/zh-CN/sql-reference/sql-statements/Data Manipulation/ROUTINE LOAD.md
##########
@@ -309,6 +321,84 @@ under the License.
             "property.ssl.key.password" = "abcdefg",
             "property.client.id" = "my_client_id"
         );
+    4. Simple mode json load
+        CREATE ROUTINE LOAD example_db.test_json_label_1 ON table1
+        COLUMNS(category,price,author)
+        PROPERTIES
+        (
+        "desired_concurrent_number"="3",
+        "max_batch_interval" = "20",
+        "max_batch_rows" = "300000",
+        "max_batch_size" = "209715200",
+        "strict_mode" = "false",
+        "format" = "json"
+        )
+        FROM KAFKA
+        (
+        "kafka_broker_list" = "broker1:9092,broker2:9092,broker3:9092",
+        "kafka_topic" = "my_topic",
+        "kafka_partitions" = "0,1,2",
+        "kafka_offsets" = "0,0,0"
+        );
+        Two json data formats are supported:
+        1){"category":"a9jadhx","author":"test","price":895}
+        2){
+            "RECORDS":[
+                {"category":"a9jadhx","author":"test","price":895},
+                {"category":"axdfa1","author":"EvelynWaugh","price":1299}
+            ]

Review comment:
       ```suggestion
           2)[
                   {"category":"a9jadhx","author":"test","price":895},
                   {"category":"axdfa1","author":"EvelynWaugh","price":1299}
               ]
   ```
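
   For reference, a minimal sketch of how one of the single-object json messages above could be pushed into the Kafka topic that the routine load job consumes. This assumes a standard Kafka installation with kafka-console-producer.sh on the PATH; it is only an illustration, not part of the proposed docs:

       # publish one single-object json record to the topic read by example_db.test_json_label_1
       echo '{"category":"a9jadhx","author":"test","price":895}' | \
           kafka-console-producer.sh --broker-list broker1:9092 --topic my_topic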

##########
File path: docs/zh-CN/sql-reference/sql-statements/Data Manipulation/STREAM LOAD.md
##########
@@ -67,13 +67,28 @@ under the License.
        For example, to load into partitions p1 and p2: -H "partitions: p1, p2"
 
        timeout: the timeout for this load, in seconds. Default is 600 seconds. The configurable range is 1 second to 259200 seconds.
-        
+
        strict_mode: whether strict mode is enabled for this load; enabled by default. Disable it with -H "strict_mode: false".
 
        timezone: the time zone used for this load. Default is UTC+8. This parameter affects the results of all timezone-related functions involved in the load.
-        
+
        exec_mem_limit: memory limit for the load. Default is 2GB. The unit is bytes.
 
+        format: specifies the format of the load data. Default is csv; the json format is also supported.
+
+        jsonpaths: json loading comes in two modes: simple mode and precise mode.
+              Simple mode: when the jsonpaths parameter is not set, this mode is used. It requires the json data to be an object, for example:
+              {"k1":1, "k2":2, "k3":"hello"}, where k1, k2, k3 are column names.
+
+              Precise mode: used when the json data is relatively complex and the jsonpaths parameter is needed to extract the corresponding values.

Review comment:
       I feel "precise" is not quite the right word here. What this mode does is matching / mapping work, isn't it?
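
   To make the matching/mapping idea concrete, a hedged sketch of what a jsonpaths-based stream load call could look like. The header names columns and jsonpaths come from this diff and from example 9 above, but the exact jsonpaths value syntax (a json array of path expressions such as "$.category") is an assumption about how this PR implements it:

       # hypothetical "matching mode" call: columns lists the target column names,
       # jsonpaths lists the json locations mapped onto them, in the same order
       curl --location-trusted -u root -H "format: json" \
           -H "columns: category, author, price" \
           -H 'jsonpaths: ["$.category", "$.author", "$.price"]' \
           -T testData http://host:port/api/testDb/testTbl/_stream_load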




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]


