This is an automated email from the ASF dual-hosted git repository.

morningman pushed a commit to branch branch-2.1
in repository https://gitbox.apache.org/repos/asf/doris.git


The following commit(s) were added to refs/heads/branch-2.1 by this push:
     new 4aae68f62fd [test](catalog) add upgrade test cases for external catalog (#39063) (#39140)
4aae68f62fd is described below

commit 4aae68f62fdc6b9324d287b53012e5d30bd650af
Author: Mingyu Chen <[email protected]>
AuthorDate: Fri Aug 9 15:12:26 2024 +0800

    [test](catalog) add upgrade test cases for external catalog (#39063) (#39140)
    
    bp #39063
---
 regression-test/README.md                          |  39 ++++--
 .../data/external_table_p0/upgrade/test.out        |  52 +++++++
 .../suites/external_table_p0/upgrade/load.groovy   | 152 +++++++++++++++++++++
 .../suites/external_table_p0/upgrade/test.groovy   |  83 +++++++++++
 4 files changed, 312 insertions(+), 14 deletions(-)

diff --git a/regression-test/README.md b/regression-test/README.md
index f8404bbd77b..1cc6aca452e 100644
--- a/regression-test/README.md
+++ b/regression-test/README.md
@@ -17,49 +17,58 @@ specific language governing permissions and limitations
 under the License.
 -->
 
-# 新加case注意事项
+# Guide for test cases
 
-## 常规 case
-1. 变量名前要写 def,否则是全局变量,并行跑的 case 的时候可能被其他 case 影响。
+## General Case
+
+1. Write `def` before variable names; otherwise they become global variables and may be affected by other cases running in parallel.
 
     Problematic code:
     ```
     ret = ***
     ```
+
     Correct code:
     ```
     def ret = ***
     ```
-2. 尽量不要在 case 中 global 的设置 session variable,或者修改集群配置,可能会影响其他 case。
+
+2. Avoid setting global session variables or modifying cluster configurations in a case, as this may affect other cases.
 
     Problematic code:
     ```
     sql """set global enable_pipeline_x_engine=true;"""
     ```
+
     Correct code:
     ```
     sql """set enable_pipeline_x_engine=true;"""
     ```
-3. 如果必须要设置 global,或者要改集群配置,可以指定 case 以 nonConcurrent 的方式运行。
 
-    [示例](https://github.com/apache/doris/blob/master/regression-test/suites/query_p0/sql_functions/cast_function/test_cast_string_to_array.groovy#L18)
-4. case 中涉及时间相关的,最好固定时间,不要用类似 now() 函数这种动态值,避免过一段时间后 case 就跑不过了。
+3. If it is necessary to set a global variable or modify the cluster configuration, mark the case to run in `nonConcurrent` mode.
+
+    [Example](https://github.com/apache/doris/blob/master/regression-test/suites/query_p0/sql_functions/cast_function/test_cast_string_to_array.groovy#L18)
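+
+    A minimal sketch, assuming a hypothetical suite name, of tagging a case as `nonConcurrent` so it can safely change a global setting and restore it afterwards:
+
+    ```
+    suite("test_some_global_setting", "nonConcurrent") {
+        sql """set global enable_pipeline_x_engine=true;"""
+        // ... queries that depend on the global setting ...
+        sql """set global enable_pipeline_x_engine=false;"""
+    }
+    ```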
+
+4. For cases involving time, use fixed time values instead of dynamic values such as `now()`, so that the case does not start failing after some time.
 
     Problematic code:
     ```
     sql """select count(*) from table where created < now();"""
     ```
+
     Correct code:
     ```
     sql """select count(*) from table where created < '2023-11-13';"""
     ```
-5. case 中 streamload 后请加上 sync 一下,避免在多 FE 环境中执行不稳定。
+
+5. After a stream load in a case, add a `sync` to keep execution stable in a multi-FE environment.
 
     Problematic code:
     ```
     streamLoad { ... }
     sql """select count(*) from table """
     ```
+
     Correct code:
     ```
     streamLoad { ... }
@@ -67,13 +76,15 @@ under the License.
     sql """select count(*) from table """
     ```
 
-6. UDF 的 case,需要把对应的 jar 包拷贝到所有 BE 机器上。
+6. For UDF cases, copy the corresponding JAR file to all BE machines.
+
+    [Example](https://github.com/apache/doris/blob/master/regression-test/suites/javaudf_p0/test_javaudf_case.groovy#L27)
 
+
+7. Do not create tables with the same name in different cases under the same directory, to avoid conflicts between cases.
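+
+    For example (file and table names are hypothetical), if two cases in the same directory both create a table named `t1`, they can clash when run in parallel; prefer case-specific table names:
+
+    ```
+    // in a.groovy
+    sql """create table test_a_tbl (k int) ..."""
+    // in b.groovy -- do not also create test_a_tbl or a generic t1 here
+    sql """create table test_b_tbl (k int) ..."""
+    ```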
 
-    [示例](https://github.com/apache/doris/blob/master/regression-test/suites/javaudf_p0/test_javaudf_case.groovy#L27)
+## Compatibility case
 
-7. 同一个目录下不同case间不要创建相同的表,避免互相冲突
+This refers to cases where resources or rules created on the initial cluster, such as privileges and UDFs, must still work after the cluster is restarted (FE restart testing) or upgraded (upgrade testing).
 
-## 兼容性 case
-指重启 FE 测试或升级测试中,在初始集群上创建的资源或规则,在集群重启或升级后也能正常使用,比如权限、UDF等。
-这些 case 需要拆分成两个文件,load.groovy 和 xxxx.groovy,放到一个文件夹中并加上 `restart_fe` 组标签,[示例](https://github.com/apache/doris/pull/37118)。
+Such a case needs to be split into two files, `load.groovy` and `xxxx.groovy`, placed in one folder and tagged with the `restart_fe` group label; see this [example](https://github.com/apache/doris/pull/37118).
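+
+As an illustration only (suite and table names are hypothetical), such a pair can look like this: `load.groovy` creates the resource on the initial cluster, and the second file verifies it after the restart or upgrade.
+
+```
+// load.groovy -- runs before the FE restart / upgrade
+suite("upgrade_demo_load", "restart_fe") {
+    sql """create table if not exists upgrade_demo_tbl (k int)
+           distributed by hash(k) buckets 1
+           properties("replication_num" = "1");"""
+    sql """insert into upgrade_demo_tbl values (1);"""
+}
+
+// test.groovy -- runs after the FE restart / upgrade
+suite("upgrade_demo_test", "restart_fe") {
+    qt_check """select * from upgrade_demo_tbl;"""
+}
+```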
 
diff --git a/regression-test/data/external_table_p0/upgrade/test.out b/regression-test/data/external_table_p0/upgrade/test.out
new file mode 100644
index 00000000000..24271ce5469
--- /dev/null
+++ b/regression-test/data/external_table_p0/upgrade/test.out
@@ -0,0 +1,52 @@
+-- This file is automatically generated. You should know what you did if you want to edit this
+-- !hive1 --
+1      
+2      是
+3      III类户
+
+-- !hive2 --
+1      
+2      是
+3      III类户
+
+-- !ice_rest1 --
+-97226733      true    -7154   93      -21192305       72903745700759648       9.823223        0.20321478918108538     2969-02-03      1970-01-03T08:00:01.000001      2017-12-03T10:12:55.038194      !gdi%$v SkfeaRF9prAcz   AwYZHC7 0000101110010000110     2549.125025     Hefei
+-97834381      false   30790   -51     329082  67450834578680294       8.32709 1.695344509839929       2969-02-03      1970-01-02T08:00:01.000001      2017-12-03T10:12:55.038194      !po(qhs OildgTxC6Yp3MW0 gHN     00111111101000010000000 3469.439442     Shanghai
+-97894492      false   -17752  89      -17810797       24642172641614839       8.316614        1.3521628855022216      2969-02-03      1970-01-01T08:00:01.000001      2017-12-02T10:12:55.038194      cixjlv$&#gapyzo zpowfZb32re9ScL 1uQ8VE964       0010110000001   5770.442982     Hangzhou
+-98385282      false   12821   -29     -4184928        22450415624137594       5.0620017       3.754570746934458       1969-09-21      1970-01-01T08:00:01.000001      2017-12-02T10:12:55.038194      j       eIkY1NtoauZfTEJd        lij8utE 00011111000     9537.539893     Shanghai
+-98428712      true    -19994  -37     -10403458       -79747158929324986      9.39804 9.184069888398918       2969-02-03      1970-01-02T08:00:01.000001      2017-12-01T10:12:55.038194      q)drohaz        2IgsxyLcl       aY9UJjuwdpK6    00001   7232.956012     Beijing
+-98494260      true    -31980  -1      -17192202       -46554401964424720      2.624445        3.357848957747339       1969-09-21      1970-01-01T08:00:01.000001      2017-12-01T10:12:55.038194      #grf    Rv      nWv5Z9E6        101010  4409.038994     Shanghai
+-98623027      true    -2082   59      -18839692       -65567297943909781      4.0910926       4.667513572115034       1969-09-21      1970-01-04T08:00:01.000001      2017-12-02T10:12:55.038194      h!(q    hGpXWFVwsSYm38t5        FINmCZMUE3dvl5Gz7PAq    00      1726.632887     Shanghai
+-98763612      true    -20598  -116    -890992 -47536120268669383      7.334266        3.714530270898617       2000-12-31      1970-01-04T08:00:01.000001      2017-12-03T10:12:55.038194      wq*!md)ak^(lbz  Ac      ZvRJzGTmkYrVcl5UCi3S    00100100011011010101    9153.323992     Hefei
+-99328658      false   -18802  97      -20497622       -11291399559888626      4.1584477       6.082803585731483       2000-12-31      1970-01-03T08:00:01.000001      2017-12-02T10:12:55.038194      !bzk*x@c%(pdqgw 9yHPEgnQSr7zC5R8GVpf    dv6DcPa 10011010011001000100000101      3688.355941     Hangzhou
+-99567408      true    -12903  -18     -10473717       71894869370121368       1.1465179       1.9420032493661743      2000-12-31      1970-01-01T08:00:01.000001      2017-12-01T10:12:55.038194      fk&xbldo*h      0PrxtvD kTj0SH  1000101 4939.088709     Hangzhou
+
+-- !ice_rest2 --
+-97226733      true    -7154   93      -21192305       72903745700759648       9.823223        0.20321478918108538     2969-02-03      1970-01-03T08:00:01.000001      2017-12-03T10:12:55.038194      !gdi%$v SkfeaRF9prAcz   AwYZHC7 0000101110010000110     2549.125025     Hefei
+-97834381      false   30790   -51     329082  67450834578680294       8.32709 1.695344509839929       2969-02-03      1970-01-02T08:00:01.000001      2017-12-03T10:12:55.038194      !po(qhs OildgTxC6Yp3MW0 gHN     00111111101000010000000 3469.439442     Shanghai
+-97894492      false   -17752  89      -17810797       24642172641614839       8.316614        1.3521628855022216      2969-02-03      1970-01-01T08:00:01.000001      2017-12-02T10:12:55.038194      cixjlv$&#gapyzo zpowfZb32re9ScL 1uQ8VE964       0010110000001   5770.442982     Hangzhou
+-98385282      false   12821   -29     -4184928        22450415624137594       5.0620017       3.754570746934458       1969-09-21      1970-01-01T08:00:01.000001      2017-12-02T10:12:55.038194      j       eIkY1NtoauZfTEJd        lij8utE 00011111000     9537.539893     Shanghai
+-98428712      true    -19994  -37     -10403458       -79747158929324986      9.39804 9.184069888398918       2969-02-03      1970-01-02T08:00:01.000001      2017-12-01T10:12:55.038194      q)drohaz        2IgsxyLcl       aY9UJjuwdpK6    00001   7232.956012     Beijing
+-98494260      true    -31980  -1      -17192202       -46554401964424720      2.624445        3.357848957747339       1969-09-21      1970-01-01T08:00:01.000001      2017-12-01T10:12:55.038194      #grf    Rv      nWv5Z9E6        101010  4409.038994     Shanghai
+-98623027      true    -2082   59      -18839692       -65567297943909781      4.0910926       4.667513572115034       1969-09-21      1970-01-04T08:00:01.000001      2017-12-02T10:12:55.038194      h!(q    hGpXWFVwsSYm38t5        FINmCZMUE3dvl5Gz7PAq    00      1726.632887     Shanghai
+-98763612      true    -20598  -116    -890992 -47536120268669383      7.334266        3.714530270898617       2000-12-31      1970-01-04T08:00:01.000001      2017-12-03T10:12:55.038194      wq*!md)ak^(lbz  Ac      ZvRJzGTmkYrVcl5UCi3S    00100100011011010101    9153.323992     Hefei
+-99328658      false   -18802  97      -20497622       -11291399559888626      4.1584477       6.082803585731483       2000-12-31      1970-01-03T08:00:01.000001      2017-12-02T10:12:55.038194      !bzk*x@c%(pdqgw 9yHPEgnQSr7zC5R8GVpf    dv6DcPa 10011010011001000100000101      3688.355941     Hangzhou
+-99567408      true    -12903  -18     -10473717       71894869370121368       1.1465179       1.9420032493661743      2000-12-31      1970-01-01T08:00:01.000001      2017-12-01T10:12:55.038194      fk&xbldo*h      0PrxtvD kTj0SH  1000101 4939.088709     Hangzhou
+
+-- !ice_hms1 --
+true   2
+
+-- !paimon_fs1 --
+1      2       3       4       5       6       7       8       9.1     10.1    11.10   2020-02-02      13str   14varchar       a       true    aaaa    2023-08-13T09:32:38.530
+10     20      30      40      50      60      70      80      90.1    100.1   110.10  2020-03-02      130str  140varchar      b       false   bbbb    2023-08-14T08:32:52.821
+
+-- !paimon_fs2 --
+1      2       3       4       5       6       7       8       9.1     10.1    11.10   2020-02-02      13str   14varchar       a       true    aaaa    2023-08-13T09:32:38.530
+10     20      30      40      50      60      70      80      90.1    100.1   110.10  2020-03-02      130str  140varchar      b       false   bbbb    2023-08-14T08:32:52.821
+
+-- !mysql1 --
+2023-06-17T10:00       2023-06-17T10:00:01.100 2023-06-17T10:00:02.220 2023-06-17T10:00:03.333 2023-06-17T10:00:04.444400      2023-06-17T10:00:05.555550      2023-06-17T10:00:06.666666
+
+-- !mysql2 --
+2023-06-17T10:00       2023-06-17T10:00:01.100 2023-06-17T10:00:02.220 2023-06-17T10:00:03.333 2023-06-17T10:00:04.444400      2023-06-17T10:00:05.555550      2023-06-17T10:00:06.666666
+
diff --git a/regression-test/suites/external_table_p0/upgrade/load.groovy b/regression-test/suites/external_table_p0/upgrade/load.groovy
new file mode 100644
index 00000000000..6dc068bb6be
--- /dev/null
+++ b/regression-test/suites/external_table_p0/upgrade/load.groovy
@@ -0,0 +1,152 @@
+// Licensed to the Apache Software Foundation (ASF) under one
+// or more contributor license agreements.  See the NOTICE file
+// distributed with this work for additional information
+// regarding copyright ownership.  The ASF licenses this file
+// to you under the Apache License, Version 2.0 (the
+// "License"); you may not use this file except in compliance
+// with the License.  You may obtain a copy of the License at
+//
+//   http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing,
+// software distributed under the License is distributed on an
+// "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+// KIND, either express or implied.  See the License for the
+// specific language governing permissions and limitations
+// under the License.
+
+suite("test_catalog_upgrade_load", 
"p0,external,hive,external_docker,external_docker_hive,restart_fe,upgrade_case")
 {
+
+    // Hive
+    String enabled = context.config.otherConfigs.get("enableHiveTest")
+    if (enabled != null && enabled.equalsIgnoreCase("true")) {
+        String hivePrefix = "hive2"
+        String catalog_name = "test_catalog_upgrade_hive2"
+        String extHiveHmsHost = context.config.otherConfigs.get("externalEnvIp")
+        String extHiveHmsPort = context.config.otherConfigs.get(hivePrefix + "HmsPort")
+        sql """drop catalog if exists ${catalog_name};"""
+        sql """
+            create catalog if not exists ${catalog_name} properties (
+                'type'='hms',
+                'hive.metastore.uris' = 'thrift://${extHiveHmsHost}:${extHiveHmsPort}'
+            );
+        """
+        logger.info("catalog " + catalog_name + " created")
+    }
+
+    // Iceberg rest catalog
+    enabled = context.config.otherConfigs.get("enableIcebergTest")
+    if (enabled != null && enabled.equalsIgnoreCase("true")) {
+        String rest_port = context.config.otherConfigs.get("iceberg_rest_uri_port")
+        String minio_port = context.config.otherConfigs.get("iceberg_minio_port")
+        String externalEnvIp = context.config.otherConfigs.get("externalEnvIp")
+        String catalog_name = "test_catalog_upgrade_iceberg_rest"
+
+        sql """drop catalog if exists ${catalog_name}"""
+        sql """
+        CREATE CATALOG ${catalog_name} PROPERTIES (
+            'type'='iceberg',
+            'iceberg.catalog.type'='rest',
+            'uri' = 'http://${externalEnvIp}:${rest_port}',
+            "s3.access_key" = "admin",
+            "s3.secret_key" = "password",
+            "s3.endpoint" = "http://${externalEnvIp}:${minio_port}";,
+            "s3.region" = "us-east-1"
+        );"""
+    }
+
+    // Iceberg hms catalog
+    enabled = context.config.otherConfigs.get("enableHiveTest")
+    if (enabled != null && enabled.equalsIgnoreCase("true")) {
+        String hivePrefix = "hive2"
+        String hms_port = context.config.otherConfigs.get(hivePrefix + "HmsPort")
+        String hdfs_port = context.config.otherConfigs.get(hivePrefix + "HdfsPort")
+        String externalEnvIp = context.config.otherConfigs.get("externalEnvIp")
+        String catalog_name = "test_catalog_upgrade_iceberg_hms"
+
+        sql """drop catalog if exists ${catalog_name}"""
+        sql """create catalog if not exists ${catalog_name} properties (
+            'type'='iceberg',
+            'iceberg.catalog.type'='hms',
+            'hive.metastore.uris' = 'thrift://${externalEnvIp}:${hms_port}',
+            'warehouse' = 'hdfs://${externalEnvIp}:${hdfs_port}/user/iceberg_test/',
+            'fs.defaultFS' = 'hdfs://${externalEnvIp}:${hdfs_port}'
+        );""" 
+    }
+
+    // Paimon filesystem catalog
+    enabled = context.config.otherConfigs.get("enablePaimonTest")
+    if (enabled != null && enabled.equalsIgnoreCase("true")) {
+        String hdfs_port = context.config.otherConfigs.get("hive2HdfsPort")
+        String externalEnvIp = context.config.otherConfigs.get("externalEnvIp")
+        String catalog_name = "test_catalog_upgrade_paimon_fs"
+        sql """drop catalog if exists ${catalog_name}"""
+        sql """create catalog if not exists ${catalog_name} properties (
+            "type" = "paimon",
+            "paimon.catalog.type"="filesystem",
+            "warehouse" = 
"hdfs://${externalEnvIp}:${hdfs_port}/user/doris/paimon1"
+        );""" 
+    }
+
+    // Kerberos hive catalog
+    enabled = context.config.otherConfigs.get("enableKerberosTest")
+    if (enabled != null && enabled.equalsIgnoreCase("true")) {
+        String catalog_name = "test_catalog_upgrade_kerberos_hive"
+        // sql """drop catalog if exists ${catalog_name};"""
+        // sql """
+        //     CREATE CATALOG IF NOT EXISTS ${catalog_name}
+        //     PROPERTIES (
+        //         "type" = "hms",
+        //         "hive.metastore.uris" = "thrift://172.31.71.25:9083",
+        //         "fs.defaultFS" = "hdfs://172.31.71.25:8020",
+        //         "hadoop.kerberos.min.seconds.before.relogin" = "5",
+        //         "hadoop.security.authentication" = "kerberos",
+        //         "hadoop.kerberos.principal"="hive/[email protected]",
+        //         "hadoop.kerberos.keytab" = "/keytabs/hive-presto-master.keytab",
+        //         "hive.metastore.sasl.enabled" = "true",
+        //         "hive.metastore.kerberos.principal" = "hive/[email protected]"
+        //     );
+        // """
+
+        // sql """drop catalog if exists other_${catalog_name};"""
+        // sql """
+        //     CREATE CATALOG IF NOT EXISTS other_${catalog_name}
+        //     PROPERTIES (
+        //         "type" = "hms",
+        //         "hive.metastore.uris" = "thrift://172.31.71.26:9083",
+        //         "fs.defaultFS" = "hdfs://172.31.71.26:8020",
+        //         "hadoop.kerberos.min.seconds.before.relogin" = "5",
+        //         "hadoop.security.authentication" = "kerberos",
+        //         "hadoop.kerberos.principal"="hive/[email protected]",
+        //         "hadoop.kerberos.keytab" = "/keytabs/other-hive-presto-master.keytab",
+        //         "hive.metastore.sasl.enabled" = "true",
+        //         "hive.metastore.kerberos.principal" = "hive/[email protected]",
+        //         "hadoop.security.auth_to_local" = "RULE:[2:\$1@\$0](.*@OTHERREALM.COM)s/@.*//
+        //                                           RULE:[2:\$1@\$0](.*@OTHERLABS.TERADATA.COM)s/@.*//
+        //                                           DEFAULT"
+        //     );
+        // """
+    }
+
+    // Jdbc MySQL catalog
+    enabled = context.config.otherConfigs.get("enableJdbcTest")
+    if (enabled != null && enabled.equalsIgnoreCase("true")) {
+        String mysql_port = context.config.otherConfigs.get("mysql_57_port");
+        String externalEnvIp = context.config.otherConfigs.get("externalEnvIp")
+        String s3_endpoint = getS3Endpoint()
+        String bucket = getS3BucketName()
+        String driver_url = "https://${bucket}.${s3_endpoint}/regression/jdbc_driver/mysql-connector-java-8.0.25.jar";
+        // String driver_url = "mysql-connector-java-8.0.25.jar"
+        String catalog_name = "test_catalog_upgrade_jdbc_mysql"
+        sql """drop catalog if exists ${catalog_name} """
+        sql """create catalog if not exists ${catalog_name} properties(
+            "type"="jdbc",
+            "user"="root",
+            "password"="123456",
+            "jdbc_url" = 
"jdbc:mysql://${externalEnvIp}:${mysql_port}/doris_test?useSSL=false&zeroDateTimeBehavior=convertToNull",
+            "driver_url" = "${driver_url}",
+            "driver_class" = "com.mysql.cj.jdbc.Driver"
+        );"""
+    }
+}
+
diff --git a/regression-test/suites/external_table_p0/upgrade/test.groovy b/regression-test/suites/external_table_p0/upgrade/test.groovy
new file mode 100644
index 00000000000..a74106ba75f
--- /dev/null
+++ b/regression-test/suites/external_table_p0/upgrade/test.groovy
@@ -0,0 +1,83 @@
+// Licensed to the Apache Software Foundation (ASF) under one
+// or more contributor license agreements.  See the NOTICE file
+// distributed with this work for additional information
+// regarding copyright ownership.  The ASF licenses this file
+// to you under the Apache License, Version 2.0 (the
+// "License"); you may not use this file except in compliance
+// with the License.  You may obtain a copy of the License at
+//
+//   http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing,
+// software distributed under the License is distributed on an
+// "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+// KIND, either express or implied.  See the License for the
+// specific language governing permissions and limitations
+// under the License.
+
+suite("test_catalog_upgrade_test", 
"p0,external,hive,external_docker,external_docker_hive,restart_fe,upgrade_case")
 {
+
+    // Hive
+    String enabled = context.config.otherConfigs.get("enableHiveTest")
+    if (enabled != null && enabled.equalsIgnoreCase("true")) {
+        sql """switch test_catalog_upgrade_hive2"""
+        order_qt_hive1 """select * from multi_catalog.test_chinese_orc limit 10""";
+        sql """refresh catalog test_catalog_upgrade_hive2"""
+        order_qt_hive2 """select * from multi_catalog.test_chinese_orc limit 10""";
+    }
+
+    // Iceberg rest catalog
+    enabled = context.config.otherConfigs.get("enableIcebergTest")
+    if (enabled != null && enabled.equalsIgnoreCase("true")) {
+        String rest_port = context.config.otherConfigs.get("iceberg_rest_uri_port")
+        sql """switch test_catalog_upgrade_iceberg_rest"""
+        order_qt_ice_rest1 """select * from format_v2.sample_cow_parquet order by id limit 10;"""
+        sql """refresh catalog test_catalog_upgrade_iceberg_rest"""
+        order_qt_ice_rest2 """select * from format_v2.sample_cow_parquet order by id limit 10;"""
+    }
+
+    // Iceberg hms catalog
+    enabled = context.config.otherConfigs.get("enableHiveTest")
+    if (enabled != null && enabled.equalsIgnoreCase("true")) {
+        sql """switch test_catalog_upgrade_iceberg_hms"""
+        sql """drop database if exists ice_upgrade_db""";
+        sql """create database ice_upgrade_db""";
+        sql """use ice_upgrade_db"""
+        sql """CREATE TABLE unpartitioned_table (
+                  `col1` BOOLEAN COMMENT 'col1',
+                  `col2` INT COMMENT 'col2'
+                )  ENGINE=iceberg
+                PROPERTIES (
+                  'write-format'='parquet'
+                );
+        """
+        sql """insert into unpartitioned_table values(true, 2)"""
+        order_qt_ice_hms1 """select * from unpartitioned_table"""
+    }
+
+    // Paimon filesystem catalog
+    enabled = context.config.otherConfigs.get("enablePaimonTest")
+    if (enabled != null && enabled.equalsIgnoreCase("true")) {
+        sql """switch test_catalog_upgrade_paimon_fs"""
+        order_qt_paimon_fs1 """select * from db1.all_table limit 10""";
+        sql """refresh catalog test_catalog_upgrade_paimon_fs"""
+        order_qt_paimon_fs2 """select * from db1.all_table limit 10""";
+    }
+
+    // Kerberos hive catalog
+    enabled = context.config.otherConfigs.get("enableKerberosTest")
+    if (enabled != null && enabled.equalsIgnoreCase("true")) {
+        String catalog_name = "test_catalog_upgrade_kerberos_hive"
+        // TODO
+    }
+
+    // Jdbc MySQL catalog
+    enabled = context.config.otherConfigs.get("enableJdbcTest")
+    if (enabled != null && enabled.equalsIgnoreCase("true")) {
+        sql """switch test_catalog_upgrade_jdbc_mysql"""
+        order_qt_mysql1 """select * from doris_test.dt""";
+        sql """refresh catalog test_catalog_upgrade_jdbc_mysql"""
+        order_qt_mysql2 """select * from doris_test.dt""";
+    }
+}
+

