My company has been exploring the Google Spark Operator for running Spark
jobs on a Kubernetes cluster, but we've run into a number of limitations
and bugs, and the project seems to be only weakly supported.

Is there any official Apache option, or plans for such an option, to run
Spark jobs on Kubernetes? Is there perhaps an official Apache Spark
Operator in the works?

We currently run jobs on both Databricks and on Amazon EMR, but it would be
nice to have a good option for running Spark directly on our Kubernetes
clusters.
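
For reference, the closest thing to an official option that we're aware of
is Spark's built-in Kubernetes scheduler backend, where spark-submit talks
directly to the Kubernetes API server and launches driver and executor
pods itself (no operator involved). A minimal sketch, with the API server
address, image name, and jar path as placeholders for your own values:

```shell
# Submit a job using Spark's native Kubernetes support (Spark 2.3+).
# <k8s-apiserver>, <your-spark-image>, and the example jar version are
# placeholders -- substitute values from your own cluster and build.
spark-submit \
  --master k8s://https://<k8s-apiserver>:6443 \
  --deploy-mode cluster \
  --name spark-pi \
  --class org.apache.spark.examples.SparkPi \
  --conf spark.executor.instances=2 \
  --conf spark.kubernetes.container.image=<your-spark-image> \
  local:///opt/spark/examples/jars/spark-examples.jar
```

This works for one-off submissions, but it lacks the declarative,
CRD-based job management that an operator provides, which is why we were
looking at the operator route in the first place.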

thanks :)
