This is an automated email from the ASF dual-hosted git repository.
chengpan pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/kyuubi.git
The following commit(s) were added to refs/heads/master by this push:
new 9fefd47e1 [KYUUBI #5934] [K8S][HELM] Add Spark configuration support
9fefd47e1 is described below
commit 9fefd47e1ff197b3ea053afe78c7e0ef8111be59
Author: dnskr <[email protected]>
AuthorDate: Wed Jan 3 10:14:59 2024 +0800
[KYUUBI #5934] [K8S][HELM] Add Spark configuration support
# :mag: Description
## Issue References 🔍
The PR adds basic support for configuration files used by the Apache
Spark query engine.
Relates to
https://github.com/apache/kyuubi/issues/4629#issuecomment-1489570556.
## Describe Your Solution 🔧
The PR adds:
- an Apache Spark related `ConfigMap` that is mounted into Kyuubi pods
as files.
- `KYUUBI_CONF_DIR` and `SPARK_CONF_DIR` environment variables.
Changing the `ConfigMap`:
- does not require restarting Kyuubi pods, because the mounted file
contents are eventually refreshed in the running containers.
- does not affect already running Spark applications.
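For illustration, the new values are rendered into a dedicated `ConfigMap`, separate from the existing Kyuubi one; a sketch of the rendered output for a release named `kyuubi` (the property contents here are placeholders, not part of this PR) might look like:

```yaml
# Sketch of the rendered Spark ConfigMap; the release name `kyuubi`
# and the spark-defaults.conf contents are illustrative placeholders.
apiVersion: v1
kind: ConfigMap
metadata:
  name: kyuubi-spark
data:
  spark-defaults.conf: |
    spark.submit.deployMode=cluster
```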
## Types of changes :bookmark:
- [ ] Bugfix (non-breaking change which fixes an issue)
- [x] New feature (non-breaking change which adds functionality)
- [ ] Breaking change (fix or feature that would cause existing
functionality to change)
## Test Plan 🧪
#### Behavior Without This Pull Request :coffin:
Users can configure default Spark properties with `volumes`, `volumeMounts`
and other general chart properties.
This approach requires restarting Kyuubi pods on every configuration change.
#### Behavior With This Pull Request :tada:
Users can configure default Spark properties with `sparkConf.sparkEnv`,
`sparkConf.sparkDefaults`, `sparkConf.log4j2` and `sparkConf.metrics`.
This approach does not require Kyuubi pod restarts.
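The properties above can be sketched as a `values.yaml` fragment; the keys are the ones introduced by this PR, while the file contents are illustrative placeholders:

```yaml
# Hypothetical values.yaml fragment using the new sparkConf keys;
# the contents below are placeholders, not recommendations.
sparkConf:
  sparkEnv: |
    #!/usr/bin/env bash
    export SPARK_LOG_DIR=/opt/spark/logs
  sparkDefaults: |
    spark.submit.deployMode=cluster
    spark.kubernetes.container.image=apache/spark:3.5.0
  log4j2: ~
  metrics: ~
```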
#### Related Unit Tests
N/A
---
# Checklist 📝
- [x] This patch was not authored or co-authored using [Generative
Tooling](https://www.apache.org/legal/generative-tooling.html)
**Be nice. Be informative.**
Closes #5934 from dnskr/helm-spark-configs.
Closes #5934
3f224e4cf [dnskr] [K8S][HELM] Add Spark configuration support
Authored-by: dnskr <[email protected]>
Signed-off-by: Cheng Pan <[email protected]>
---
charts/kyuubi/templates/kyuubi-configmap.yaml | 1 -
.../kyuubi/templates/kyuubi-spark-configmap.yaml | 40 ++++++++++++++++++++
charts/kyuubi/templates/kyuubi-statefulset.yaml | 16 ++++++--
charts/kyuubi/values.yaml | 43 +++++++++++++++++++++-
4 files changed, 95 insertions(+), 5 deletions(-)
diff --git a/charts/kyuubi/templates/kyuubi-configmap.yaml b/charts/kyuubi/templates/kyuubi-configmap.yaml
index 62413567d..0f838857e 100644
--- a/charts/kyuubi/templates/kyuubi-configmap.yaml
+++ b/charts/kyuubi/templates/kyuubi-configmap.yaml
@@ -24,7 +24,6 @@ metadata:
data:
{{- with .Values.kyuubiConf.kyuubiEnv }}
kyuubi-env.sh: |
- #!/usr/bin/env bash
{{- tpl . $ | nindent 4 }}
{{- end }}
kyuubi-defaults.conf: |
diff --git a/charts/kyuubi/templates/kyuubi-spark-configmap.yaml b/charts/kyuubi/templates/kyuubi-spark-configmap.yaml
new file mode 100644
index 000000000..5794c429f
--- /dev/null
+++ b/charts/kyuubi/templates/kyuubi-spark-configmap.yaml
@@ -0,0 +1,40 @@
+{{/*
+ Licensed to the Apache Software Foundation (ASF) under one or more
+ contributor license agreements. See the NOTICE file distributed with
+ this work for additional information regarding copyright ownership.
+ The ASF licenses this file to You under the Apache License, Version 2.0
+ (the "License"); you may not use this file except in compliance with
+ the License. You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
+*/}}
+
+apiVersion: v1
+kind: ConfigMap
+metadata:
+ name: {{ .Release.Name }}-spark
+ labels:
+ {{- include "kyuubi.labels" . | nindent 4 }}
+data:
+ {{- with .Values.sparkConf.sparkEnv }}
+ spark-env.sh: |
+ {{- tpl . $ | nindent 4 }}
+ {{- end }}
+ {{- with .Values.sparkConf.sparkDefaults }}
+ spark-defaults.conf: |
+ {{- tpl . $ | nindent 4 }}
+ {{- end }}
+ {{- with .Values.sparkConf.log4j2 }}
+ log4j2.properties: |
+ {{- tpl . $ | nindent 4 }}
+ {{- end }}
+ {{- with .Values.sparkConf.metrics }}
+ metrics.properties: |
+ {{- tpl . $ | nindent 4 }}
+ {{- end }}
diff --git a/charts/kyuubi/templates/kyuubi-statefulset.yaml b/charts/kyuubi/templates/kyuubi-statefulset.yaml
index 309ef8ec9..a79b5be9a 100644
--- a/charts/kyuubi/templates/kyuubi-statefulset.yaml
+++ b/charts/kyuubi/templates/kyuubi-statefulset.yaml
@@ -62,9 +62,14 @@ spec:
{{- with .Values.args }}
args: {{- tpl (toYaml .) $ | nindent 12 }}
{{- end }}
- {{- with .Values.env }}
- env: {{- tpl (toYaml .) $ | nindent 12 }}
- {{- end }}
+ env:
+ - name: KYUUBI_CONF_DIR
+ value: {{ .Values.kyuubiConfDir }}
+ - name: SPARK_CONF_DIR
+ value: {{ .Values.sparkConfDir }}
+ {{- with .Values.env }}
+ {{- tpl (toYaml .) $ | nindent 12 }}
+ {{- end }}
{{- with .Values.envFrom }}
envFrom: {{- tpl (toYaml .) $ | nindent 12 }}
{{- end }}
@@ -105,6 +110,8 @@ spec:
volumeMounts:
- name: conf
mountPath: {{ .Values.kyuubiConfDir }}
+ - name: conf-spark
+ mountPath: {{ .Values.sparkConfDir }}
{{- with .Values.volumeMounts }}
{{- tpl (toYaml .) $ | nindent 12 }}
{{- end }}
@@ -115,6 +122,9 @@ spec:
- name: conf
configMap:
name: {{ .Release.Name }}
+ - name: conf-spark
+ configMap:
+ name: {{ .Release.Name }}-spark
{{- with .Values.volumes }}
{{- tpl (toYaml .) $ | nindent 8 }}
{{- end }}
diff --git a/charts/kyuubi/values.yaml b/charts/kyuubi/values.yaml
index 044668040..31d802fd4 100644
--- a/charts/kyuubi/values.yaml
+++ b/charts/kyuubi/values.yaml
@@ -152,12 +152,13 @@ monitoring:
# $KYUUBI_CONF_DIR directory
kyuubiConfDir: /opt/kyuubi/conf
-# Kyuubi configurations files
+# Kyuubi configuration files
kyuubiConf:
# The value (templated string) is used for kyuubi-env.sh file
# See example at conf/kyuubi-env.sh.template and https://kyuubi.readthedocs.io/en/master/configuration/settings.html#environments for more details
kyuubiEnv: ~
# kyuubiEnv: |
+ # #!/usr/bin/env bash
# export JAVA_HOME=/usr/jdk64/jdk1.8.0_152
# export SPARK_HOME=/opt/spark
# export FLINK_HOME=/opt/flink
@@ -179,6 +180,46 @@ kyuubiConf:
# See example at conf/log4j2.xml.template https://kyuubi.readthedocs.io/en/master/configuration/settings.html#logging for more details
log4j2: ~
+# $SPARK_CONF_DIR directory
+sparkConfDir: /opt/spark/conf
+# Spark configuration files
+sparkConf:
+ # The value (templated string) is used for spark-env.sh file
+ # See example at https://github.com/apache/spark/blob/master/conf/spark-env.sh.template and Spark documentation for more details
+ sparkEnv: ~
+ # sparkEnv: |
+ # #!/usr/bin/env bash
+ # export JAVA_HOME=/usr/jdk64/jdk1.8.0_152
+ # export SPARK_LOG_DIR=/opt/spark/logs
+ # export SPARK_LOG_MAX_FILES=5
+
+ # The value (templated string) is used for spark-defaults.conf file
+ # See example at https://github.com/apache/spark/blob/master/conf/spark-defaults.conf.template and Spark documentation for more details
+ sparkDefaults: ~
+ # sparkDefaults: |
+ # spark.submit.deployMode=cluster
+ # spark.kubernetes.container.image=apache/spark:3.5.0
+ # spark.kubernetes.authenticate.driver.serviceAccountName=spark
+ # spark.kubernetes.file.upload.path=s3a://kyuubi/spark
+ # # S3 dependencies
+ # spark.jars.packages=org.apache.hadoop:hadoop-aws:3.3.4,com.amazonaws:aws-java-sdk-bundle:1.12.262
+ # spark.driver.extraJavaOptions=-Divy.cache.dir=/tmp -Divy.home=/tmp
+ # # S3A configuration
+ # spark.hadoop.fs.s3a.impl=org.apache.hadoop.fs.s3a.S3AFileSystem
+ # spark.hadoop.fs.s3a.endpoint=http://object-storage:80
+ # spark.hadoop.fs.s3a.access.key=******
+ # spark.hadoop.fs.s3a.secret.key=********
+ # spark.hadoop.fs.s3a.path.style.access=true
+ # spark.hadoop.fs.s3a.fast.upload=true
+
+ # The value (templated string) is used for log4j2.properties file
+ # See example at https://github.com/apache/spark/blob/master/conf/log4j2.properties.template and Spark documentation for more details
+ log4j2: ~
+
+ # The value (templated string) is used for metrics.properties file
+ # See example at https://github.com/apache/spark/blob/master/conf/metrics.properties.template and Spark documentation for more details
+ metrics: ~
+
# Command to launch Kyuubi server (templated)
command: ~
# Arguments to launch Kyuubi server (templated)