This is an automated email from the ASF dual-hosted git repository.
diwu pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/doris.git
The following commit(s) were added to refs/heads/master by this push:
new 805673dd537 [typo](doc)Add spark load faq (#28015)
805673dd537 is described below
commit 805673dd53763dc9056d1ef5d074c416d4ff2e2f
Author: caoliang-web <[email protected]>
AuthorDate: Wed Dec 13 09:45:15 2023 +0800
[typo](doc)Add spark load faq (#28015)
---
docs/en/docs/data-operate/import/import-way/spark-load-manual.md | 2 +-
docs/zh-CN/docs/data-operate/import/import-way/spark-load-manual.md | 2 +-
2 files changed, 2 insertions(+), 2 deletions(-)
diff --git a/docs/en/docs/data-operate/import/import-way/spark-load-manual.md b/docs/en/docs/data-operate/import/import-way/spark-load-manual.md
index 37c60a0b4a8..c3c9a7dbb0f 100644
--- a/docs/en/docs/data-operate/import/import-way/spark-load-manual.md
+++ b/docs/en/docs/data-operate/import/import-way/spark-load-manual.md
@@ -785,7 +785,7 @@ The most suitable scenario to use spark load is that the raw data is in the file
If the `JAVA_HOME` environment variable is not set, the error `yarn application kill failed. app id: xxx, load job id: xxx, msg: which: no xxx/lib/yarn-client/hadoop/bin/yarn in ((null)) Error: JAVA_HOME is not set and could not be found` will be reported.
-* When using spark load, the launch log for `SparkLauncher` is not printed.
+* When using spark load, the launch log for `SparkLauncher` is not printed, or the error `start spark app failed. error: Waiting too much time to get appId from handle. spark app state: UNKNOWN, loadJobId:xxx` is reported.
In `<SPARK_HOME>`/conf, add the log4j.properties configuration file and set the log level to INFO.
diff --git a/docs/zh-CN/docs/data-operate/import/import-way/spark-load-manual.md b/docs/zh-CN/docs/data-operate/import/import-way/spark-load-manual.md
index 2989318eccd..1a4968c016a 100644
--- a/docs/zh-CN/docs/data-operate/import/import-way/spark-load-manual.md
+++ b/docs/zh-CN/docs/data-operate/import/import-way/spark-load-manual.md
@@ -748,7 +748,7 @@ LoadFinishTime: 2019-07-27 11:50:16
If the `JAVA_HOME` environment variable is not set, the error `yarn application kill failed. app id: xxx, load job id: xxx, msg: which: no xxx/lib/yarn-client/hadoop/bin/yarn in ((null)) Error: JAVA_HOME is not set and could not be found` will be reported.
-- When using Spark Load, the launch log for SparkLauncher is not printed.
+- When using Spark Load, the launch log for SparkLauncher is not printed, or the error `start spark app failed. error: Waiting too much time to get appId from handle. spark app state: UNKNOWN, loadJobId:xxx` is reported.
Under `<SPARK_HOME>`/conf, add the log4j.properties configuration file and set the log level to INFO.
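For reference only (not part of this commit): a minimal sketch of the log4j.properties both FAQ entries refer to, modeled on Spark's bundled log4j.properties.template (log4j 1.x, as shipped with Spark 2.x), could look like the following; placing it under <SPARK_HOME>/conf should make the SparkLauncher launch log visible.

    # Log everything at INFO to the console so the SparkLauncher/spark-submit output is printed
    log4j.rootCategory=INFO, console
    log4j.appender.console=org.apache.log4j.ConsoleAppender
    log4j.appender.console.target=System.err
    log4j.appender.console.layout=org.apache.log4j.PatternLayout
    log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

The appender target and conversion pattern above follow Spark's default template and can be adjusted as needed.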
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]