peacewong commented on code in PR #4767: URL: https://github.com/apache/linkis/pull/4767#discussion_r1267464066
##########
linkis-engineconn-plugins/spark/src/main/resources/spark-k8s-operator.md:
##########
@@ -0,0 +1,98 @@
+
+### 1. spark-on-k8s-operator document
+
+```text
+https://github.com/GoogleCloudPlatform/spark-on-k8s-operator/blob/master/docs/quick-start-guide.md
+```
+
+### 2. spark-on-k8s-operator install
+
+```text
+helm repo add spark-operator https://googlecloudplatform.github.io/spark-on-k8s-operator
+
+helm install my-release spark-operator/spark-operator --namespace spark-operator --create-namespace --set webhook.enable=true
+```
+
+### 3. spark-on-k8s-operator test task submit
+
+```text
+kubectl apply -f examples/spark-pi.yaml
+```
+
+### 4. If an error is reported: Message: Forbidden! Configured service account doesn't have access. Service account may have been revoked. pods "spark-pi-driver" is forbidden: error looking up service account spark/spark: serviceaccount "spark" not found.
+
+```text
+kubectl create serviceaccount spark
+
+kubectl create clusterrolebinding spark-role --clusterrole=edit --serviceaccount=default:spark --namespace=default
+```
+
+### 5. spark-on-k8s-operator uninstall (usually not required; only needed when the installation has problems)
+
+```text
+helm uninstall my-release --namespace spark-operator
+
+kubectl delete serviceaccounts my-release-spark-operator --namespace spark-operator
+
+kubectl delete clusterrole my-release-spark-operator --namespace spark-operator
+
+kubectl delete clusterrolebindings my-release-spark-operator --namespace spark-operator
+```
+
+### 6. Submitting tasks with the Restful API
+```text
+POST /api/rest_j/v1/entrance/submit

Review Comment:
   Need to add a linkis-cli demo. (See the example sketched at the end of this message.)

##########
linkis-engineconn-plugins/spark/src/main/scala/org/apache/linkis/engineplugin/spark/factory/SparkEngineConnFactory.scala:
##########
@@ -95,6 +95,17 @@ class SparkEngineConnFactory extends MultiExecutorEngineConnFactory with Logging
     sparkConfig.setJavaHome(variable(Environment.JAVA_HOME))
     sparkConfig.setSparkHome(SPARK_HOME.getValue(options))
     sparkConfig.setMaster(SPARK_MASTER.getValue(options))
+    sparkConfig.setK8sConfigFile(SPARK_K8S_CONFIG_FILE.getValue(options))

Review Comment:
   Is there a need to distinguish between Yarn and K8s here? (See the guard sketched below.)

##########
linkis-engineconn-plugins/spark/src/main/scala/org/apache/linkis/engineplugin/spark/config/SparkConfiguration.scala:
##########
@@ -50,6 +50,18 @@ object SparkConfiguration extends Logging {
   val SPARK_APP_RESOURCE = CommonVars[String]("spark.app.resource", "")
   val SPARK_APP_CONF = CommonVars[String]("spark.extconf", "")
+  val SPARK_K8S_CONFIG_FILE = CommonVars[String]("spark.k8s.config.file", "")

Review Comment:
   If this is not a native Spark or K8s parameter, it needs a "linkis" prefix. (See the naming sketch below.)
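
On the `linkis` prefix point: Spark's own Kubernetes options live under the `spark.kubernetes.*` namespace, so a key such as `spark.k8s.config.file` reads as Linkis-specific. Below is a minimal sketch of what a prefixed definition could look like, in the same `CommonVars` style the diff already uses; the key name is an illustration, not the final name, and the import path is assumed to be the one `SparkConfiguration` already uses.

```scala
import org.apache.linkis.common.conf.CommonVars

object SparkK8sVarsSketch {
  // Illustrative only: a parameter that is not a native Spark/K8s option carries a
  // "linkis." prefix so it cannot be mistaken for an upstream spark.kubernetes.* key.
  // The exact key below is a placeholder, not the name agreed in this PR.
  val SPARK_K8S_CONFIG_FILE = CommonVars[String]("linkis.spark.k8s.config.file", "")
}
```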
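
On the Yarn-vs-K8s question in `SparkEngineConnFactory`: a small, self-contained sketch of one possible guard, assuming Spark's `k8s://<api-server>` master URL convention. The `SparkConfig` setters from the diff are not reproduced; only the decision logic is shown, and how it is wired into the factory is left to the PR author.

```scala
// Sketch: only apply the K8s-specific settings when spark.master actually points
// at a Kubernetes API server; a plain "yarn" master keeps the existing behaviour.
object SparkMasterKindSketch {

  def isK8sMaster(master: String): Boolean =
    master != null && master.trim.toLowerCase.startsWith("k8s://")

  def main(args: Array[String]): Unit = {
    // In the factory this check could gate the sparkConfig.setK8s*(...) calls.
    println(isK8sMaster("k8s://https://kubernetes.default.svc:443")) // true
    println(isK8sMaster("yarn"))                                     // false
  }
}
```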
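
On the linkis-cli demo requested for section 6 of the doc: one possible shape for that section, kept in the same text-block style the file already uses. The flag names follow common linkis-cli usage, and the engine version, code, and user names are placeholders that would need to be checked against this PR.

```text
# Hypothetical linkis-cli submission (engine version and users are placeholders):
sh ./bin/linkis-cli -engineType spark-3.2.1 -codeType sql -code "select 1" -submitUser hadoop -proxyUser hadoop

# Restful equivalent, posting a JSON body to the entrance:
# POST /api/rest_j/v1/entrance/submit
# {
#   "executionContent": {"code": "select 1", "runType": "sql"},
#   "labels": {"engineType": "spark-3.2.1", "userCreator": "hadoop-IDE"}
# }
```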