This is an automated email from the ASF dual-hosted git repository.

morningman pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/doris-website.git


The following commit(s) were added to refs/heads/master by this push:
     new e06fd4d4598 [doc](catalog) add some faq (#617)
e06fd4d4598 is described below

commit e06fd4d4598f95f1fdac7b1a4527586c46abb477
Author: Mingyu Chen <[email protected]>
AuthorDate: Mon May 6 22:11:28 2024 +0800

    [doc](catalog) add some faq (#617)
---
 .github/workflows/build-check.yml                  |  2 +-
 docs/faq/lakehouse-faq.md                          | 31 +++++++++++++++++++---
 .../current/faq/lakehouse-faq.md                   | 30 ++++++++++++++++++---
 .../lakehouse/datalake-building/hive-build.md      |  2 +-
 .../version-2.0/faq/lakehouse-faq.md               | 21 +++++++++++++++
 .../version-2.1/faq/lakehouse-faq.md               | 24 ++++++++++++++---
 .../lakehouse/datalake-building/hive-build.md      |  2 +-
 versioned_docs/version-2.0/faq/lakehouse-faq.md    | 25 ++++++++++++++---
 versioned_docs/version-2.1/faq/lakehouse-faq.md    | 24 ++++++++++++++---
 9 files changed, 141 insertions(+), 20 deletions(-)

diff --git a/.github/workflows/build-check.yml 
b/.github/workflows/build-check.yml
index ffd33dc3744..7c877c2e3c8 100644
--- a/.github/workflows/build-check.yml
+++ b/.github/workflows/build-check.yml
@@ -27,7 +27,7 @@ concurrency:
 
 jobs:
     build-and-deploy:
-        name: BuildDeploy
+        name: Build Check
         runs-on: ubuntu-latest
         environment: Production
         steps:
diff --git a/docs/faq/lakehouse-faq.md b/docs/faq/lakehouse-faq.md
index e39cdc49d6a..088f9008150 100644
--- a/docs/faq/lakehouse-faq.md
+++ b/docs/faq/lakehouse-faq.md
@@ -24,9 +24,6 @@ specific language governing permissions and limitations
 under the License.
 -->
 
-
-# FAQ
-
 ## Certificates
 
 1. If an error is reported: `curl 77: Problem with the SSL CA cert.`, you need to update your certificate.
@@ -42,7 +39,6 @@ ln -s /etc/pki/ca-trust/extracted/openssl/ca-bundle.trust.crt 
/etc/ssl/certs/ca-
 
 ## Kerberos
 
-
 1. What to do with the `GSS initiate failed` error when connecting to Hive 
Metastore with Kerberos authentication?
 
    Usually this is caused by incorrect Kerberos authentication information; you can troubleshoot with the following steps:
@@ -107,6 +103,12 @@ ln -s 
/etc/pki/ca-trust/extracted/openssl/ca-bundle.trust.crt /etc/ssl/certs/ca-
 
 8. When using Kerberos configuration in the Catalog, the `hadoop.username` property must not appear in the Catalog properties (see the sketch below).
 
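+    For illustration, a minimal sketch of a Kerberos-enabled Catalog that authenticates via principal and keytab instead of `hadoop.username` (the metastore URI, realm, principal, and keytab path are placeholders):
+
+    ```
+    CREATE CATALOG hive_krb PROPERTIES (
+        'type' = 'hms',
+        -- placeholder values; replace with your metastore, principal, and keytab
+        'hive.metastore.uris' = 'thrift://172.0.0.1:9083',
+        'hive.metastore.sasl.enabled' = 'true',
+        'hadoop.security.authentication' = 'kerberos',
+        'hadoop.kerberos.principal' = '[email protected]',
+        'hadoop.kerberos.keytab' = '/path/to/doris.keytab'
+    );
+    ```
+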
+9. Accessing Kerberos with JDK 17.
+
+    If you run Doris with JDK 17 and access a Kerberos service, access may fail because deprecated encryption algorithms are still in use. Add the `allow_weak_crypto=true` property to krb5.conf (sketched below), or upgrade the encryption algorithms used by Kerberos.
+
+    See: https://seanjmullan.org/blog/2021/09/14/jdk17#kerberos
+
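+    A minimal sketch of the krb5.conf change (the realm below is a placeholder; keep your existing settings and only add the `allow_weak_crypto` line):
+
+    ```
+    [libdefaults]
+        # placeholder realm; only allow_weak_crypto is the required addition
+        default_realm = EXAMPLE.COM
+        allow_weak_crypto = true
+    ```
+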
 ## JDBC Catalog
 
 1. An error is reported when connecting to SQLServer through JDBC Catalog: 
`unable to find valid certification path to requested target`
@@ -251,6 +253,27 @@ ln -s 
/etc/pki/ca-trust/extracted/openssl/ca-bundle.trust.crt /etc/ssl/certs/ca-
     
     Try to update the FE node CA certificates using the command `update-ca-trust` (CentOS/RockyLinux), then restart the FE process.
 
+11. BE reports error: `java.lang.InternalError`
+
+    If you see an error like the following in `be.INFO`:
+
+    ```
+    W20240506 15:19:57.553396 266457 jni-util.cpp:259] java.lang.InternalError
+            at org.apache.hadoop.io.compress.zlib.ZlibDecompressor.init(Native Method)
+            at org.apache.hadoop.io.compress.zlib.ZlibDecompressor.<init>(ZlibDecompressor.java:114)
+            at org.apache.hadoop.io.compress.GzipCodec$GzipZlibDecompressor.<init>(GzipCodec.java:229)
+            at org.apache.hadoop.io.compress.GzipCodec.createDecompressor(GzipCodec.java:188)
+            at org.apache.hadoop.io.compress.CodecPool.getDecompressor(CodecPool.java:183)
+            at org.apache.parquet.hadoop.CodecFactory$HeapBytesDecompressor.<init>(CodecFactory.java:99)
+            at org.apache.parquet.hadoop.CodecFactory.createDecompressor(CodecFactory.java:223)
+            at org.apache.parquet.hadoop.CodecFactory.getDecompressor(CodecFactory.java:212)
+            at org.apache.parquet.hadoop.CodecFactory.getDecompressor(CodecFactory.java:43)
+    ```
+
+    This is caused by a conflict between the system libz.so and the libz.a bundled with Doris.
+
+    To solve it, execute `export LD_LIBRARY_PATH=/path/to/be/lib:$LD_LIBRARY_PATH` and then restart the BE process, as sketched below.
+
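+    For example, assuming BE is installed under `/opt/doris/be` (the install path is illustrative):
+
+    ```
+    # Put BE's bundled libraries first on the dynamic loader's search path
+    export LD_LIBRARY_PATH=/opt/doris/be/lib:$LD_LIBRARY_PATH
+    # Restart BE so the JNI code resolves the compatible libz
+    sh /opt/doris/be/bin/stop_be.sh
+    sh /opt/doris/be/bin/start_be.sh --daemon
+    ```
+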
 ## HDFS
 
 1. What to do with the `java.lang.VerifyError: xxx` error when accessing HDFS 3.x?
diff --git 
a/i18n/zh-CN/docusaurus-plugin-content-docs/current/faq/lakehouse-faq.md 
b/i18n/zh-CN/docusaurus-plugin-content-docs/current/faq/lakehouse-faq.md
index bfe24b20298..50e09e41cdc 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/current/faq/lakehouse-faq.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/current/faq/lakehouse-faq.md
@@ -24,9 +24,6 @@ specific language governing permissions and limitations
 under the License.
 -->
 
-
-# FAQ
-
 ## Certificate Issues
 
 1. An error is reported during queries: `curl 77: Problem with the SSL CA cert.`. This indicates that the current system certificates are outdated and the local certificates need to be updated.
@@ -105,6 +102,12 @@ ln -s 
/etc/pki/ca-trust/extracted/openssl/ca-bundle.trust.crt /etc/ssl/certs/ca-
 
 8. When using a Kerberos configuration in the Catalog, the `hadoop.username` property must not be used at the same time.
 
+9. Accessing Kerberos with JDK 17
+
+    If you run Doris with JDK 17 and access a Kerberos service, access may fail because deprecated encryption algorithms are still in use. Add the `allow_weak_crypto=true` property to krb5.conf, or upgrade the encryption algorithms used by Kerberos.
+
+    See: https://seanjmullan.org/blog/2021/09/14/jdk17#kerberos
+
 ## JDBC Catalog
 
 1. Connecting to SQLServer through the JDBC Catalog reports an error: `unable to find valid certification path to requested target`
@@ -244,6 +247,27 @@ ln -s 
/etc/pki/ca-trust/extracted/openssl/ca-bundle.trust.crt /etc/ssl/certs/ca-
 
     Try updating the FE node's CA certificates with `update-ca-trust` (CentOS/RockyLinux), then restart the FE process.
 
+11. BE reports error: `java.lang.InternalError`
+
+    If you see an error like the following in `be.INFO`:
+
+    ```
+    W20240506 15:19:57.553396 266457 jni-util.cpp:259] java.lang.InternalError
+            at org.apache.hadoop.io.compress.zlib.ZlibDecompressor.init(Native Method)
+            at org.apache.hadoop.io.compress.zlib.ZlibDecompressor.<init>(ZlibDecompressor.java:114)
+            at org.apache.hadoop.io.compress.GzipCodec$GzipZlibDecompressor.<init>(GzipCodec.java:229)
+            at org.apache.hadoop.io.compress.GzipCodec.createDecompressor(GzipCodec.java:188)
+            at org.apache.hadoop.io.compress.CodecPool.getDecompressor(CodecPool.java:183)
+            at org.apache.parquet.hadoop.CodecFactory$HeapBytesDecompressor.<init>(CodecFactory.java:99)
+            at org.apache.parquet.hadoop.CodecFactory.createDecompressor(CodecFactory.java:223)
+            at org.apache.parquet.hadoop.CodecFactory.getDecompressor(CodecFactory.java:212)
+            at org.apache.parquet.hadoop.CodecFactory.getDecompressor(CodecFactory.java:43)
+    ```
+
+    This is caused by a conflict between the libz.a bundled with Doris and the libz.so in the system environment.
+
+    To solve this, first execute `export LD_LIBRARY_PATH=/path/to/be/lib:$LD_LIBRARY_PATH` and then restart the BE process.
+
 ## HDFS
 
 1. An error is reported when accessing HDFS 3.x: `java.lang.VerifyError: xxx`
diff --git 
a/i18n/zh-CN/docusaurus-plugin-content-docs/current/lakehouse/datalake-building/hive-build.md
 
b/i18n/zh-CN/docusaurus-plugin-content-docs/current/lakehouse/datalake-building/hive-build.md
index 0648ed2abc4..94f905149e0 100644
--- 
a/i18n/zh-CN/docusaurus-plugin-content-docs/current/lakehouse/datalake-building/hive-build.md
+++ 
b/i18n/zh-CN/docusaurus-plugin-content-docs/current/lakehouse/datalake-building/hive-build.md
@@ -43,7 +43,7 @@ under the License.
     );
     ```
         
-    Note: if you need to create Hive tables or write data through Doris, you need to visibly add the `fs.defaultFS` property in the Catalog properties. If the Catalog is created only for queries, this parameter can be omitted.
+    Note: if you need to create Hive tables or write data through Doris, you need to explicitly add the `fs.defaultFS` property in the Catalog properties. If the Catalog is created only for queries, this parameter can be omitted.
     
     For more parameters, see [Hive Catalog](../datalake-analytics/hive.md)
 
diff --git 
a/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.0/faq/lakehouse-faq.md 
b/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.0/faq/lakehouse-faq.md
index fc8966dc1d8..c6bdb9893ef 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.0/faq/lakehouse-faq.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.0/faq/lakehouse-faq.md
@@ -238,6 +238,27 @@ under the License.
 
     Try updating the FE node's CA certificates with `update-ca-trust` (CentOS/RockyLinux), then restart the FE process.
 
+11. BE reports error: `java.lang.InternalError`
+
+    If you see an error like the following in `be.INFO`:
+
+    ```
+    W20240506 15:19:57.553396 266457 jni-util.cpp:259] java.lang.InternalError
+            at org.apache.hadoop.io.compress.zlib.ZlibDecompressor.init(Native Method)
+            at org.apache.hadoop.io.compress.zlib.ZlibDecompressor.<init>(ZlibDecompressor.java:114)
+            at org.apache.hadoop.io.compress.GzipCodec$GzipZlibDecompressor.<init>(GzipCodec.java:229)
+            at org.apache.hadoop.io.compress.GzipCodec.createDecompressor(GzipCodec.java:188)
+            at org.apache.hadoop.io.compress.CodecPool.getDecompressor(CodecPool.java:183)
+            at org.apache.parquet.hadoop.CodecFactory$HeapBytesDecompressor.<init>(CodecFactory.java:99)
+            at org.apache.parquet.hadoop.CodecFactory.createDecompressor(CodecFactory.java:223)
+            at org.apache.parquet.hadoop.CodecFactory.getDecompressor(CodecFactory.java:212)
+            at org.apache.parquet.hadoop.CodecFactory.getDecompressor(CodecFactory.java:43)
+    ```
+
+    This is caused by a conflict between the libz.a bundled with Doris and the libz.so in the system environment.
+
+    To solve this, first execute `export LD_LIBRARY_PATH=/path/to/be/lib:$LD_LIBRARY_PATH` and then restart the BE process.
+
 ## HDFS
 
 1. An error is reported when accessing HDFS 3.x: `java.lang.VerifyError: xxx`
diff --git 
a/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.1/faq/lakehouse-faq.md 
b/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.1/faq/lakehouse-faq.md
index bfe24b20298..66fe58474d2 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.1/faq/lakehouse-faq.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.1/faq/lakehouse-faq.md
@@ -24,9 +24,6 @@ specific language governing permissions and limitations
 under the License.
 -->
 
-
-# FAQ
-
 ## Certificate Issues
 
 1. An error is reported during queries: `curl 77: Problem with the SSL CA cert.`. This indicates that the current system certificates are outdated and the local certificates need to be updated.
@@ -244,6 +241,27 @@ ln -s 
/etc/pki/ca-trust/extracted/openssl/ca-bundle.trust.crt /etc/ssl/certs/ca-
 
     Try updating the FE node's CA certificates with `update-ca-trust` (CentOS/RockyLinux), then restart the FE process.
 
+11. BE reports error: `java.lang.InternalError`
+
+    If you see an error like the following in `be.INFO`:
+
+    ```
+    W20240506 15:19:57.553396 266457 jni-util.cpp:259] java.lang.InternalError
+            at org.apache.hadoop.io.compress.zlib.ZlibDecompressor.init(Native Method)
+            at org.apache.hadoop.io.compress.zlib.ZlibDecompressor.<init>(ZlibDecompressor.java:114)
+            at org.apache.hadoop.io.compress.GzipCodec$GzipZlibDecompressor.<init>(GzipCodec.java:229)
+            at org.apache.hadoop.io.compress.GzipCodec.createDecompressor(GzipCodec.java:188)
+            at org.apache.hadoop.io.compress.CodecPool.getDecompressor(CodecPool.java:183)
+            at org.apache.parquet.hadoop.CodecFactory$HeapBytesDecompressor.<init>(CodecFactory.java:99)
+            at org.apache.parquet.hadoop.CodecFactory.createDecompressor(CodecFactory.java:223)
+            at org.apache.parquet.hadoop.CodecFactory.getDecompressor(CodecFactory.java:212)
+            at org.apache.parquet.hadoop.CodecFactory.getDecompressor(CodecFactory.java:43)
+    ```
+
+    This is caused by a conflict between the libz.a bundled with Doris and the libz.so in the system environment.
+
+    To solve this, first execute `export LD_LIBRARY_PATH=/path/to/be/lib:$LD_LIBRARY_PATH` and then restart the BE process.
+
 ## HDFS
 
 1. An error is reported when accessing HDFS 3.x: `java.lang.VerifyError: xxx`
diff --git 
a/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.1/lakehouse/datalake-building/hive-build.md
 
b/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.1/lakehouse/datalake-building/hive-build.md
index 0648ed2abc4..94f905149e0 100644
--- 
a/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.1/lakehouse/datalake-building/hive-build.md
+++ 
b/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.1/lakehouse/datalake-building/hive-build.md
@@ -43,7 +43,7 @@ under the License.
     );
     ```
         
-    Note: if you need to create Hive tables or write data through Doris, you need to visibly add the `fs.defaultFS` property in the Catalog properties. If the Catalog is created only for queries, this parameter can be omitted.
+    Note: if you need to create Hive tables or write data through Doris, you need to explicitly add the `fs.defaultFS` property in the Catalog properties. If the Catalog is created only for queries, this parameter can be omitted.
     
     For more parameters, see [Hive Catalog](../datalake-analytics/hive.md)
 
diff --git a/versioned_docs/version-2.0/faq/lakehouse-faq.md 
b/versioned_docs/version-2.0/faq/lakehouse-faq.md
index 130f3b1ff33..b04d7b61e2d 100644
--- a/versioned_docs/version-2.0/faq/lakehouse-faq.md
+++ b/versioned_docs/version-2.0/faq/lakehouse-faq.md
@@ -24,12 +24,8 @@ specific language governing permissions and limitations
 under the License.
 -->
 
-
-# FAQ
-
 ## Kerberos
 
-
 1. What to do with the `GSS initiate failed` error when connecting to Hive 
Metastore with Kerberos authentication?
 
    Usually this is caused by incorrect Kerberos authentication information; you can troubleshoot with the following steps:
@@ -233,6 +229,27 @@ under the License.
     
     Try to update the FE node CA certificates using the command `update-ca-trust` (CentOS/RockyLinux), then restart the FE process.
 
+11. BE reports error: `java.lang.InternalError`
+
+    If you see an error like the following in `be.INFO`:
+
+    ```
+    W20240506 15:19:57.553396 266457 jni-util.cpp:259] java.lang.InternalError
+            at org.apache.hadoop.io.compress.zlib.ZlibDecompressor.init(Native Method)
+            at org.apache.hadoop.io.compress.zlib.ZlibDecompressor.<init>(ZlibDecompressor.java:114)
+            at org.apache.hadoop.io.compress.GzipCodec$GzipZlibDecompressor.<init>(GzipCodec.java:229)
+            at org.apache.hadoop.io.compress.GzipCodec.createDecompressor(GzipCodec.java:188)
+            at org.apache.hadoop.io.compress.CodecPool.getDecompressor(CodecPool.java:183)
+            at org.apache.parquet.hadoop.CodecFactory$HeapBytesDecompressor.<init>(CodecFactory.java:99)
+            at org.apache.parquet.hadoop.CodecFactory.createDecompressor(CodecFactory.java:223)
+            at org.apache.parquet.hadoop.CodecFactory.getDecompressor(CodecFactory.java:212)
+            at org.apache.parquet.hadoop.CodecFactory.getDecompressor(CodecFactory.java:43)
+    ```
+
+    This is caused by a conflict between the system libz.so and the libz.a bundled with Doris.
+
+    To solve it, execute `export LD_LIBRARY_PATH=/path/to/be/lib:$LD_LIBRARY_PATH` and then restart the BE process.
+
 ## HDFS
 
 1. What to do with the `java.lang.VerifyError: xxx` error when accessing HDFS 3.x?
diff --git a/versioned_docs/version-2.1/faq/lakehouse-faq.md 
b/versioned_docs/version-2.1/faq/lakehouse-faq.md
index e39cdc49d6a..e86467b9d68 100644
--- a/versioned_docs/version-2.1/faq/lakehouse-faq.md
+++ b/versioned_docs/version-2.1/faq/lakehouse-faq.md
@@ -24,9 +24,6 @@ specific language governing permissions and limitations
 under the License.
 -->
 
-
-# FAQ
-
 ## Certificates
 
 1. If an error is reported: `curl 77: Problem with the SSL CA cert.`, you need to update your certificate.
@@ -251,6 +248,27 @@ ln -s 
/etc/pki/ca-trust/extracted/openssl/ca-bundle.trust.crt /etc/ssl/certs/ca-
     
     Try to update the FE node CA certificates using the command `update-ca-trust` (CentOS/RockyLinux), then restart the FE process.
 
+11. BE reports error: `java.lang.InternalError`
+
+    If you see an error like the following in `be.INFO`:
+
+    ```
+    W20240506 15:19:57.553396 266457 jni-util.cpp:259] java.lang.InternalError
+            at org.apache.hadoop.io.compress.zlib.ZlibDecompressor.init(Native Method)
+            at org.apache.hadoop.io.compress.zlib.ZlibDecompressor.<init>(ZlibDecompressor.java:114)
+            at org.apache.hadoop.io.compress.GzipCodec$GzipZlibDecompressor.<init>(GzipCodec.java:229)
+            at org.apache.hadoop.io.compress.GzipCodec.createDecompressor(GzipCodec.java:188)
+            at org.apache.hadoop.io.compress.CodecPool.getDecompressor(CodecPool.java:183)
+            at org.apache.parquet.hadoop.CodecFactory$HeapBytesDecompressor.<init>(CodecFactory.java:99)
+            at org.apache.parquet.hadoop.CodecFactory.createDecompressor(CodecFactory.java:223)
+            at org.apache.parquet.hadoop.CodecFactory.getDecompressor(CodecFactory.java:212)
+            at org.apache.parquet.hadoop.CodecFactory.getDecompressor(CodecFactory.java:43)
+    ```
+
+    This is caused by a conflict between the system libz.so and the libz.a bundled with Doris.
+
+    To solve it, execute `export LD_LIBRARY_PATH=/path/to/be/lib:$LD_LIBRARY_PATH` and then restart the BE process.
+
 ## HDFS
 
 1. What to do with the `java.lang.VerifyError: xxx` error when accessing HDFS 3.x?


---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
