dongjoon-hyun commented on code in PR #2:
URL: https://github.com/apache/spark-kubernetes-operator/pull/2#discussion_r1551965460


##########
spark-operator-docs/spark_application.md:
##########
@@ -0,0 +1,212 @@
+## Spark Application API
+
+The core user-facing API of the Spark Kubernetes Operator is the SparkApplication
+Custom Resource Definition (CRD). The SparkApplication custom resource extends the
+standard Kubernetes API; it defines the Spark application spec and tracks its status.
+
+Once the Spark Operator is installed and running in your Kubernetes environment, it
+continuously watches SparkApplication resources submitted by the user (via a
+Kubernetes API client or kubectl) and orchestrates their secondary resources (pods,
+configmaps, etc.).
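+
+For example, a SparkApplication manifest (like the spark-pi example below) can be
+submitted with the standard Kubernetes client. A minimal sketch, assuming the
+manifest is saved as spark-pi.yaml (the file name and the lowercase resource name
+are illustrative; the latter follows CRD naming defaults):
+
+```bash
+# Submit the SparkApplication custom resource to the cluster
+kubectl apply -f spark-pi.yaml
+
+# Inspect the status the operator tracks for it
+kubectl get sparkapplication spark-pi -n spark-test -o yaml
+```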
+
+Please also check out the [quickstart](getting_started.md) for installing the
+operator.
+
+## SparkApplication
+
+A SparkApplication can be defined in YAML format with just the minimal required
+fields to start:
+
+```
+apiVersion: org.apache.spark/v1alpha1
+kind: SparkApplication
+metadata:
+  name: spark-pi
+  namespace: spark-test
+spec:
+  mainClass: "org.apache.spark.examples.SparkPi"
+  jars: "local:///opt/spark/examples/jars/spark-examples.jar"
+  sparkConf:
+    spark.executor.instances: "5"
+    spark.kubernetes.container.image: "spark:3.4.1-scala2.12-java11-python3-r-ubuntu"

Review Comment:
   Please use 3.5.1.
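
    For example, keeping the same tag naming scheme (a sketch; whether a matching
    3.5.1 image variant is published under this exact tag is an assumption):

    ```yaml
    sparkConf:
      spark.executor.instances: "5"
      # Version bumped to 3.5.1 per review; tag scheme assumed unchanged from 3.4.1
      spark.kubernetes.container.image: "spark:3.5.1-scala2.12-java11-python3-r-ubuntu"
    ```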



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

