This is an automated email from the ASF dual-hosted git repository.
yiguolei pushed a commit to branch branch-4.0
in repository https://gitbox.apache.org/repos/asf/doris.git
The following commit(s) were added to refs/heads/branch-4.0 by this push:
new ab21bc7e73f branch-4.0: [fix](paimon) Compat HMS kerberos props in
getHadoopProperties and cover Paimon HMS+OSS case (#61155)
ab21bc7e73f is described below
commit ab21bc7e73fd3b4ad25207b5026673e1969ffab1
Author: Calvin Kirs <[email protected]>
AuthorDate: Tue Mar 10 13:05:59 2026 +0800
branch-4.0: [fix](paimon) Compat HMS kerberos props in getHadoopProperties
and cover Paimon HMS+OSS case (#61155)
…
PaimonSysTableJniScanner and related JNI paths currently consume only hadoopProps.
When a catalog uses the new HMS kerberos parameters:
- hive.metastore.authentication.type
- hive.metastore.client.principal
- hive.metastore.client.keytab
but does not explicitly configure HDFS kerberos parameters, JNI cannot
recognize the kerberos identity from getHadoopProperties().
This is especially visible in the HMS kerberos + OSS storage scenario:
HMS access should use kerberos, while storage access should still use
OSS credentials.
## What Changed
### 1. Temporary compatibility in CatalogProperty#getHadoopProperties()
When catalog properties contain:
- hive.metastore.authentication.type=kerberos
- hive.metastore.client.principal
- hive.metastore.client.keytab
getHadoopProperties() now injects canonical Hadoop kerberos keys:
- hadoop.security.authentication=kerberos
- hadoop.kerberos.principal=<hive.metastore.client.principal>
- hadoop.kerberos.keytab=<hive.metastore.client.keytab>
If configured, it also passes through:
- hadoop.security.auth_to_local
This is a short-term compatibility fix so JNI paths can reuse HMS
kerberos identity even when only HMS-side kerberos properties are
provided.
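The mapping above can be sketched as a small standalone helper. This is a hypothetical illustration of the compatibility rule, not the actual CatalogProperty code (the real change uses StringUtils and Hadoop/Doris constant classes for the key names):

```java
import java.util.HashMap;
import java.util.Map;

public class HmsKerberosCompatSketch {
    // Sketch: when HMS-side kerberos properties are set, derive the
    // canonical hadoop.* kerberos keys so JNI paths see the same identity.
    static Map<String, String> injectHadoopKerberos(Map<String, String> props) {
        Map<String, String> hadoop = new HashMap<>();
        String authType = props.get("hive.metastore.authentication.type");
        String principal = props.get("hive.metastore.client.principal");
        String keytab = props.get("hive.metastore.client.keytab");
        if ("kerberos".equalsIgnoreCase(authType)
                && principal != null && !principal.isBlank()
                && keytab != null && !keytab.isBlank()) {
            hadoop.put("hadoop.security.authentication", "kerberos");
            hadoop.put("hadoop.kerberos.principal", principal);
            hadoop.put("hadoop.kerberos.keytab", keytab);
            // Pass through auth_to_local only when explicitly configured.
            String authToLocal = props.get("hadoop.security.auth_to_local");
            if (authToLocal != null && !authToLocal.isBlank()) {
                hadoop.put("hadoop.security.auth_to_local", authToLocal);
            }
        }
        return hadoop;
    }
}
```

Note the storage-side credentials (e.g. OSS access keys) are untouched, so in the HMS kerberos + OSS case metastore access authenticates via kerberos while storage access keeps its own credentials.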
### What problem does this PR solve?
Issue Number: close #xxx
Related PR: #xxx
Problem Summary:
### Release note
None
### Check List (For Author)
- Test <!-- At least one of them must be included. -->
- [ ] Regression test
- [ ] Unit Test
- [ ] Manual test (add detailed scripts or steps below)
- [ ] No need to test or manual test. Explain why:
- [ ] This is a refactor/code format and no logic has been changed.
- [ ] Previous test can cover this change.
- [ ] No code files have been changed.
- [ ] Other reason <!-- Add your reason? -->
- Behavior changed:
- [ ] No.
- [ ] Yes. <!-- Explain the behavior change -->
- Does this need documentation?
- [ ] No.
- [ ] Yes. <!-- Add document PR link here. eg:
https://github.com/apache/doris-website/pull/1214 -->
### Check List (For Reviewer who merge this PR)
- [ ] Confirm the release note
- [ ] Confirm test cases
- [ ] Confirm document
- [ ] Add branch pick label <!-- Add branch pick label that this PR
should merge into -->
---
docker/thirdparties/custom_settings.env | 1 +
.../kerberos/entrypoint-hive-master.sh | 4 ++--
.../docker-compose/kerberos/hadoop-hive.env.tpl | 3 ++-
.../docker-compose/kerberos/kerberos1_settings.env | 3 ++-
.../docker-compose/kerberos/kerberos2_settings.env | 3 ++-
.../kerberos/sql/create_paimon_hive_table.hql | 10 +++++++++-
.../apache/doris/datasource/CatalogProperty.java | 23 ++++++++++++++++++++++
.../paimon/test_paimon_hms_catalog.out | 14 +++++++++++++
.../paimon/test_paimon_hms_catalog.groovy | 1 +
9 files changed, 56 insertions(+), 6 deletions(-)
diff --git a/docker/thirdparties/custom_settings.env b/docker/thirdparties/custom_settings.env
index b18f94fd4c1..97545885d76 100644
--- a/docker/thirdparties/custom_settings.env
+++ b/docker/thirdparties/custom_settings.env
@@ -23,3 +23,4 @@
CONTAINER_UID="doris--"
export s3Endpoint="oss-cn-hongkong.aliyuncs.com"
export s3BucketName="doris-regression-hk"
+export OSSBucket="doris-regression-bj"
diff --git a/docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh b/docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh
index b19178aefb0..4a16d1e16a1 100755
--- a/docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh
+++ b/docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh
@@ -90,7 +90,7 @@ hive -f /usr/local/sql/create_kerberos_hive_table.sql
if [[ ${enablePaimonHms} == "true" ]]; then
echo "Creating Paimon HMS catalog and table"
hadoop fs -put /tmp/paimon_data/* /user/hive/warehouse/
- hive -f /usr/local/sql/create_paimon_hive_table.hql
+ hive --hiveconf oss_bucket="${OSSBucket}" -f /usr/local/sql/create_paimon_hive_table.hql
fi
-exec_success_hook
\ No newline at end of file
+exec_success_hook
diff --git a/docker/thirdparties/docker-compose/kerberos/hadoop-hive.env.tpl b/docker/thirdparties/docker-compose/kerberos/hadoop-hive.env.tpl
index 9253b5aee09..9f36428c433 100644
--- a/docker/thirdparties/docker-compose/kerberos/hadoop-hive.env.tpl
+++ b/docker/thirdparties/docker-compose/kerberos/hadoop-hive.env.tpl
@@ -84,6 +84,7 @@
HIVE_SITE_CONF_fs_oss_impl=org.apache.hadoop.fs.aliyun.oss.AliyunOSSFileSystem
HIVE_SITE_CONF_fs_oss_accessKeyId=${OSSAk}
HIVE_SITE_CONF_fs_oss_accessKeySecret=${OSSSk}
HIVE_SITE_CONF_fs_oss_endpoint=${OSSEndpoint}
+OSSBucket=${OSSBucket}
HIVE_SITE_CONF_fs_AbstractFileSystem_gs_impl=com.google.cloud.hadoop.fs.gcs.GoogleHadoopFS
HIVE_SITE_CONF_fs_gs_project_id=${GCSProjectId}
HIVE_SITE_CONF_google_cloud_auth_service_account_enable=true
@@ -91,4 +92,4 @@
HIVE_SITE_CONF_fs_gs_auth_service_account_email=${GCSAccountEmail}
HIVE_SITE_CONF_fs_gs_auth_service_account_private_key_id=${GCSAccountPrivateKeyId}
HIVE_SITE_CONF_fs_gs_auth_service_account_private_key=${GCSAccountPrivateKey}
HIVE_SITE_CONF_fs_gs_proxy_address=${GCSProxyAddress}
-enablePaimonHms=${enablePaimonHms}
\ No newline at end of file
+enablePaimonHms=${enablePaimonHms}
diff --git a/docker/thirdparties/docker-compose/kerberos/kerberos1_settings.env b/docker/thirdparties/docker-compose/kerberos/kerberos1_settings.env
index 44f2352cf16..4d71155cbb2 100644
--- a/docker/thirdparties/docker-compose/kerberos/kerberos1_settings.env
+++ b/docker/thirdparties/docker-compose/kerberos/kerberos1_settings.env
@@ -55,6 +55,7 @@ export AWSEndpoint="s3.ap-northeast-1.amazonaws.com"
export OSSAk="*****************"
export OSSSk="*****************"
export OSSEndpoint="oss-cn-beijing.aliyuncs.com"
+export OSSBucket="${OSSBucket:-doris-regression-bj}"
export OBSAk="*****************"
export OBSSk="*****************"
export OBSEndpoint="obs.cn-north-4.myhuaweicloud.com"
@@ -65,4 +66,4 @@ export GCSProjectId=""
export GCSAccountEmail=""
export GCSAccountPrivateKeyId=""
export GCSAccountPrivateKey=""
-export GCSProxyAddress=""
\ No newline at end of file
+export GCSProxyAddress=""
diff --git a/docker/thirdparties/docker-compose/kerberos/kerberos2_settings.env b/docker/thirdparties/docker-compose/kerberos/kerberos2_settings.env
index 49099e0f441..e26a3297a75 100644
--- a/docker/thirdparties/docker-compose/kerberos/kerberos2_settings.env
+++ b/docker/thirdparties/docker-compose/kerberos/kerberos2_settings.env
@@ -51,6 +51,7 @@ export AWSEndpoint="s3.ap-northeast-1.amazonaws.com"
export OSSAk="*****************"
export OSSSk="*****************"
export OSSEndpoint="oss-cn-beijing.aliyuncs.com"
+export OSSBucket="${OSSBucket:-doris-regression-bj}"
export OBSAk="*****************"
export OBSSk="*****************"
export OBSEndpoint="obs.cn-north-4.myhuaweicloud.com"
@@ -61,4 +62,4 @@ export GCSProjectId=""
export GCSAccountEmail=""
export GCSAccountPrivateKeyId=""
export GCSAccountPrivateKey=""
-export GCSProxyAddress=""
\ No newline at end of file
+export GCSProxyAddress=""
diff --git a/docker/thirdparties/docker-compose/kerberos/sql/create_paimon_hive_table.hql b/docker/thirdparties/docker-compose/kerberos/sql/create_paimon_hive_table.hql
index 65f463fef22..70a86aacd26 100644
--- a/docker/thirdparties/docker-compose/kerberos/sql/create_paimon_hive_table.hql
+++ b/docker/thirdparties/docker-compose/kerberos/sql/create_paimon_hive_table.hql
@@ -4,4 +4,12 @@ USE hdfs_db;
CREATE EXTERNAL TABLE external_test_table
STORED BY 'org.apache.paimon.hive.PaimonStorageHandler'
-LOCATION 'hdfs:///user/hive/warehouse/hdfs_db.db/external_test_table';
\ No newline at end of file
+LOCATION 'hdfs:///user/hive/warehouse/hdfs_db.db/external_test_table';
+
+CREATE DATABASE IF NOT EXISTS ali_db;
+
+USE ali_db;
+
+CREATE EXTERNAL TABLE external_test_table
+ STORED BY 'org.apache.paimon.hive.PaimonStorageHandler'
+LOCATION 'oss://${hiveconf:oss_bucket}/regression/paimon_warehouse/ali_db.db/hive_test_table';
diff --git a/fe/fe-core/src/main/java/org/apache/doris/datasource/CatalogProperty.java b/fe/fe-core/src/main/java/org/apache/doris/datasource/CatalogProperty.java
index 0f118251197..dd5e8d641f1 100644
--- a/fe/fe-core/src/main/java/org/apache/doris/datasource/CatalogProperty.java
+++ b/fe/fe-core/src/main/java/org/apache/doris/datasource/CatalogProperty.java
@@ -18,6 +18,7 @@
package org.apache.doris.datasource;
import org.apache.doris.common.UserException;
+import org.apache.doris.common.security.authentication.AuthenticationConfig;
import org.apache.doris.datasource.property.metastore.MetastoreProperties;
import org.apache.doris.datasource.property.storage.StorageProperties;
@@ -25,8 +26,10 @@ import com.aliyun.odps.table.utils.Preconditions;
import com.google.common.collect.Maps;
import com.google.gson.annotations.SerializedName;
import org.apache.commons.collections4.MapUtils;
+import org.apache.commons.lang3.StringUtils;
import org.apache.commons.lang3.exception.ExceptionUtils;
import org.apache.hadoop.conf.Configuration;
+import org.apache.hadoop.fs.CommonConfigurationKeysPublic;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
@@ -287,6 +290,26 @@ public class CatalogProperty {
});
}
}
+            // Temporary compatibility: if the catalog uses HMS kerberos auth, expose it as canonical
+            // hadoop.* kerberos properties so sys table JNI scanners can reuse the same identity.
+            String hiveMetastoreAuthenticationType = properties.get("hive.metastore.authentication.type");
+            String hiveMetastoreClientPrincipal = properties.get("hive.metastore.client.principal");
+            String hiveMetastoreClientKeytab = properties.get("hive.metastore.client.keytab");
+            if ("kerberos".equalsIgnoreCase(hiveMetastoreAuthenticationType)
+                    && StringUtils.isNotBlank(hiveMetastoreClientPrincipal)
+                    && StringUtils.isNotBlank(hiveMetastoreClientKeytab)) {
+                hadoopProperties.put(CommonConfigurationKeysPublic.HADOOP_SECURITY_AUTHENTICATION,
+                        "kerberos");
+                hadoopProperties.put(AuthenticationConfig.HADOOP_KERBEROS_PRINCIPAL,
+                        hiveMetastoreClientPrincipal);
+                hadoopProperties.put(AuthenticationConfig.HADOOP_KERBEROS_KEYTAB,
+                        hiveMetastoreClientKeytab);
+                if (StringUtils.isNotBlank(
+                        properties.get(AuthenticationConfig.HADOOP_SECURITY_AUTH_TO_LOCAL))) {
+                    hadoopProperties.put(AuthenticationConfig.HADOOP_SECURITY_AUTH_TO_LOCAL,
+                            properties.get(AuthenticationConfig.HADOOP_SECURITY_AUTH_TO_LOCAL));
+                }
+            }
}
}
}
diff --git a/regression-test/data/external_table_p2/paimon/test_paimon_hms_catalog.out b/regression-test/data/external_table_p2/paimon/test_paimon_hms_catalog.out
index c55e77b909c..1ec006b964f 100644
--- a/regression-test/data/external_table_p2/paimon/test_paimon_hms_catalog.out
+++ b/regression-test/data/external_table_p2/paimon/test_paimon_hms_catalog.out
@@ -17,6 +17,20 @@
-- !hdfs_new_kerberos --
11111111 hdfs_db_test
+-- !oss_hms_kerberos --
+11111111 ali_db_test
+11111111 ali_db_test
+11111111 ali_db_test
+11111111 ali_db_test
+11111111 ali_db_test
+
+-- !oss_hms_kerberos --
+11111111 ali_db_test
+11111111 ali_db_test
+11111111 ali_db_test
+11111111 ali_db_test
+11111111 ali_db_test
+
-- !oss --
11111111 ali_db_test
11111111 ali_db_test
diff --git a/regression-test/suites/external_table_p2/paimon/test_paimon_hms_catalog.groovy b/regression-test/suites/external_table_p2/paimon/test_paimon_hms_catalog.groovy
index 27df6cd5219..49f4945f60e 100644
--- a/regression-test/suites/external_table_p2/paimon/test_paimon_hms_catalog.groovy
+++ b/regression-test/suites/external_table_p2/paimon/test_paimon_hms_catalog.groovy
@@ -190,6 +190,7 @@ suite("test_paimon_hms_catalog", "p2,external,paimon,new_catalog_property") {
     testQuery(paimon_hms_catalog_properties + hdfs_warehouse_properties + hdfs_storage_properties, "hdfs", "hdfs_db")
     testQuery(paimon_hms_type_prop + hdfs_warehouse_properties + hms_kerberos_new_prop + hdfs_kerberos_properties, "hdfs_kerberos", "hdfs_db")
     testQuery(paimon_hms_type_prop + hdfs_warehouse_properties + hms_kerberos_new_prop + hdfs_new_kerberos_properties, "hdfs_new_kerberos", "hdfs_db")
+    testQuery(paimon_hms_type_prop + hms_kerberos_new_prop + oss_warehouse_properties + oss_storage_properties, "oss_hms_kerberos", "ali_db")
     testQuery(paimon_hms_catalog_properties + oss_warehouse_properties + oss_storage_properties, "oss", "ali_db")
     testQuery(paimon_hms_catalog_properties + obs_warehouse_properties + obs_storage_properties, "obs", "hw_db")
     testQuery(paimon_hms_catalog_properties + cos_warehouse_properties + cos_storage_properties, "cos", "tx_db")
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]